|dc.description.abstracteng||A prudent approach to understanding the function of the cerebral cortex begins with understanding the repertoire of its dynamics. In this thesis, I study how interactions between single-neuron properties, synaptic coupling, and connectivity produce the microstate stability and macrostate activity exhibited by models of cortical circuits. One aim was to tie these micro and macro levels of description together in pursuit of understanding the collective behavior. Another aim was to determine to what degree the collective behavior persists when the single-neuron model is made less idealized and exhibits a richer class of dynamics. I focused on purely inhibitory, random, balanced networks of spiking neurons, the simplest in silico network model of spiking neurons with which one can obtain the kind of asynchronous and irregular activity thought to act as a base state in many cortical areas.
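The network class just described can be sketched in a few lines of simulation code. The following is a minimal, illustrative sketch, not the machine-precise, event-driven implementation used in the thesis: a purely inhibitory, random, balanced LIF network with delta-pulse synapses and simple Euler integration, with all parameter values (N = 200 neurons, in-degree K = 20, 10 ms membrane time constant) chosen for illustration only.

```python
import numpy as np

def simulate_inhibitory_balanced_net(N=200, K=20, T=1.0, dt=1e-4,
                                     tau=0.01, v_th=1.0, v_reset=0.0,
                                     J0=1.0, seed=0):
    """Minimal sketch of a purely inhibitory, random, balanced LIF network.

    Each neuron receives K inhibitory inputs of strength -J0/sqrt(K),
    balanced on average against a constant suprathreshold external drive
    of order sqrt(K). Returns a list of (time, neuron) spike events.
    """
    rng = np.random.default_rng(seed)
    I_ext = np.sqrt(K)  # external drive; recurrent inhibition cancels it on average
    # random K-regular inhibitory connectivity (no autapses)
    W = np.zeros((N, N))
    for i in range(N):
        pre = rng.choice(np.delete(np.arange(N), i), size=K, replace=False)
        W[i, pre] = -J0 / np.sqrt(K)
    v = rng.uniform(v_reset, v_th, size=N)  # random initial condition
    spikes = []
    for step in range(int(T / dt)):
        v += dt / tau * (-v + I_ext)        # leaky integration, Euler step
        fired = np.flatnonzero(v >= v_th)
        if fired.size:
            spikes.extend((step * dt, int(i)) for i in fired)
            v += W[:, fired].sum(axis=1)    # instantaneous inhibitory pulses
            v[fired] = v_reset              # reset after threshold crossing
    return spikes

spikes = simulate_inhibitory_balanced_net()
rate = len(spikes) / (200 * 1.0)  # population-averaged firing rate in Hz
```

With these toy parameters the network settles into ongoing, temporally dispersed spiking; the asynchronous irregular character of the activity can be checked, for example, via the coefficient of variation of the interspike intervals.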
In the biologically relevant limit of fast action-potential onset and fast synapses, the collective dynamics of such networks exhibits stable chaos, in which temporally irregular activity coexists with stability to small perturbations. Previous work had demonstrated the existence of an exotic phase-space structure of flux tubes in these systems, but the mechanisms underlying the emergence of this structure, as well as its full geometry, remained largely unknown. Also lacking was the analytical apparatus to treat exactly both the microstate stability and the macrostate activity of neuron models with additional somatic or synaptic currents. For networks of Leaky Integrate-and-Fire (LIF) neurons, I present the empirical geometry of a flux tube, whose time-varying boundary is characterized by exponential decay toward, and irregular jumps away from, the stable trajectory contained within it. A detailed analysis of the spiking microstate reveals the finite-size instability underlying the separation of flux tubes: perturbation-induced crossings of pre- and postsynaptic spikes, which almost always decorrelate the microstate. Building on this analysis, I derive a host of analytical results explaining previous numerical observations: the near inevitability of a cascade of spike-sequence changes following a single spike failure; the pseudo-Lyapunov exponent characterizing the divergence after such a perturbation; and the average cross-section of the attractor basin making up the phase-space volume of a flux tube. I also introduce and calculate the perturbation recall time, defined as the characteristic delay between the time of a perturbation and the appearance of its effects in the subsequent activity. Taken together, these results form the basis for a theory of stable chaos in spiking networks and for a theory of the balanced state that keeps track of each and every spike.
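The flux-tube phenomenology summarized above can be probed numerically by evolving two copies of the same network from slightly different initial conditions. The sketch below uses illustrative parameters and a simple Euler scheme (hypothetical, rather than the thesis's event-driven implementation) to contrast a small perturbation, which decays back toward the reference trajectory, with a large one, which knocks the state into a different flux tube and decorrelates it.

```python
import numpy as np

def make_net(N=100, K=10, seed=1):
    """Random K-regular inhibitory coupling and a random initial state."""
    rng = np.random.default_rng(seed)
    W = np.zeros((N, N))
    for i in range(N):
        pre = rng.choice(np.delete(np.arange(N), i), size=K, replace=False)
        W[i, pre] = -1.0 / np.sqrt(K)
    return W, rng.uniform(0.0, 1.0, size=N)

def run(W, v0, T=0.5, dt=1e-4, tau=0.01, v_th=1.0, v_reset=0.0):
    """Euler-integrate the inhibitory LIF network; return the voltage trajectory."""
    K = int((W != 0).sum(axis=1).mean())
    I_ext = np.sqrt(K)                       # constant suprathreshold drive
    v = v0.copy()
    traj = np.empty((int(T / dt), len(v)))
    for step in range(traj.shape[0]):
        v += dt / tau * (-v + I_ext)
        fired = v >= v_th
        if fired.any():
            v += W[:, fired].sum(axis=1)     # delta-pulse inhibition
            v[fired] = v_reset
        traj[step] = v
    return traj

W, v0 = make_net()
ref = run(W, v0)                             # reference trajectory
tiny = run(W, v0 + 1e-8)                     # stays inside the flux tube
rng = np.random.default_rng(2)
big = run(W, v0 + 0.3 * rng.standard_normal(len(v0)))  # leaves the tube
d_tiny = np.linalg.norm(ref - tiny, axis=1)  # state distance over time
d_big = np.linalg.norm(ref - big, axis=1)
```

In this sketch, d_tiny decays toward zero while d_big remains of order one: the coexistence of local stability and sensitivity to large perturbations that defines stable chaos.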
The means to extend such a theory were limited by the absence of methods to compute the microstate stability of networks of neurons with more than one dynamical degree of freedom. I present a semi-analytical framework based on machine-precise, event-driven simulations, with which I realize methods to compute the full Lyapunov spectrum of a general two-dimensional linear neuron model. Two notable limits of this model are the correlated LIF (cLIF) neuron, which adds a filtering synaptic current, and the Generalized Integrate-and-Fire (GIF) neuron, which introduces resonating subthreshold dynamics. Previous work showed that balanced cLIF networks can become chaotic at some finite value of the synaptic filtering timescale. With the presented methods, I characterize the Lyapunov spectrum of cLIF networks as a function of this timescale across the transition to, and deep into, the chaotic regime. The critical value of the synaptic time constant is found to scale with the rate of spikes arriving at a neuron, and by applying the ideas developed for the theory of stable chaos in LIF networks, I estimate this scaling analytically. The size of the flux tubes is found to vanish characteristically as the critical value is approached, reminiscent of a second-order phase transition. A potential source of the instability responsible for the transition is found in the increasingly strong transient amplification exhibited by the single-neuron dynamics. Many cortical circuits have inhibitory interneurons that exhibit resonance properties and qualitatively affect the dynamics of the population. Yet this resonance is often not incorporated into models of cortical circuits, and no expression for the response function of a resonating neuron valid across all values of the timescale of the intrinsic currents was known. To fill this gap, and motivated to understand the mutual dependencies between intrinsic frequency, voltage resonance, and population spiking resonance, I employ the Gaussian neuron approach to calculate and analyze the linear response function of the population firing rate for an ensemble of GIF neurons. I find six distinct response types and use them to fully characterize the routes to resonance across all values of the relevant timescales. I find that resonance arises primarily from slow adaptation, with an intrinsic frequency acting to sharpen and adjust the location of the resonant peak. I determine the parameter regions for the existence of an intrinsic frequency, subthreshold resonance, and spiking resonance, finding all possible intersections of the three. The expressions and analysis presented here can facilitate the construction of an exact theory of correlations and stability of population activity in networks containing populations of resonator neurons.
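For concreteness, the subthreshold dynamics of a GIF neuron can be written in a standard two-variable form; the notation below is an illustrative convention and need not match the thesis. A membrane voltage $V$ is coupled to an intrinsic current variable $w$:

```latex
\begin{align}
  \tau_m \dot V &= -V - g\,w + I(t), \\
  \tau_w \dot w &= V - w,
\end{align}
```

where $\tau_m$ and $\tau_w$ are the membrane and intrinsic-current timescales and $g \ge 0$ is the coupling strength. The eigenvalues of this linear system are complex, so that an intrinsic frequency exists, precisely when

```latex
\left(\frac{1}{\tau_m} - \frac{1}{\tau_w}\right)^2 < \frac{4g}{\tau_m \tau_w},
\qquad
\omega = \sqrt{\frac{1+g}{\tau_m \tau_w}
         - \frac{1}{4}\left(\frac{1}{\tau_m} + \frac{1}{\tau_w}\right)^2},
```

with $\omega$ the angular frequency of the free subthreshold oscillation. The condition makes explicit why the relative size of the two timescales controls the existence of an intrinsic frequency.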
Taken together, the results in this thesis provide a theoretical foundation both for the stable chaos observed in models of cortical circuits and for understanding how cellular properties, such as synaptic and intrinsic currents, contribute to the micro- and macroscopic activity and response properties of these balanced-state models.||de