Neurobiologically Realistic Determinants of Self-Organized Criticality in Networks of Spiking Neurons

Self-organized criticality refers to the spontaneous emergence of self-similar dynamics in complex systems poised between order and randomness. The presence of self-organized critical dynamics in the brain is theoretically appealing and is supported by recent neurophysiological studies. Despite this, the neurobiological determinants of these dynamics have not been previously sought. Here, we systematically examined the influence of such determinants in hierarchically modular networks of leaky integrate-and-fire neurons with spike-timing-dependent synaptic plasticity and axonal conduction delays. We characterized emergent dynamics in our networks by distributions of active neuronal ensemble modules (neuronal avalanches) and rigorously assessed these distributions for power-law scaling. We found that spike-timing-dependent synaptic plasticity enabled a rapid phase transition from random subcritical dynamics to ordered supercritical dynamics. Importantly, modular connectivity and low wiring cost broadened this transition, and enabled a regime indicative of self-organized criticality. The regime only occurred when modular connectivity, low wiring cost and synaptic plasticity were simultaneously present, and the regime was most evident when between-module connection density scaled as a power-law. The regime was robust to variations in other neurobiologically relevant parameters and favored systems with low external drive and strong internal interactions. Increases in system size and connectivity facilitated internal interactions, permitting reductions in external drive and facilitating convergence of postsynaptic-response magnitude and synaptic-plasticity learning rate parameter values towards neurobiologically realistic levels. We hence infer a novel association between self-organized critical neuronal dynamics and several neurobiologically realistic features of structural connectivity. 
The central role of these features in our model may reflect their importance for neuronal information processing.

1. Exact computation of subthreshold membrane potentials at discrete time points, in parallel.
2. Estimation of continuous spike times by interpolation between discrete time points. This estimation is especially important in our study, because STDP is highly dependent on precise spike timing.

Background: the integrate-and-fire model
In a network of integrate-and-fire neurons, the membrane potential V of a neuron is given by

    C dV/dt = -I_leak + I_syn + I_ext,    (1)

where C is the membrane capacitance and I_leak, I_syn, I_ext are currents (Burkitt et al., 2006). I_leak represents passive current leaks across the membrane and is given by

    I_leak = g (V - E),

where g is the membrane conductance, and E is the equilibrium potential.
I_syn represents currents generated by synaptic inputs, and is given by

    I_syn = V_0 Σ_i w_i Σ_{t_i} [ e^{-(t - t_i)/τ_1} - e^{-(t - t_i)/τ_2} ],    (2)

where V_0 is a constant, w_i is the weight of presynaptic neuron i, t_i are the times of previous spikes of i (t_i < t), and τ_1 and τ_2 are synaptic decay constants. I_ext represents the external current, which the exact-integration scheme requires to be constant. Simulations with variable I_ext are hence composed of multiple simulation segments, with constant I_ext in each segment.
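As an illustration, the double-exponential kernel in (2) can be evaluated directly from stored spike times; this is the naive approach that the auxiliary-variable scheme described below avoids. The parameter values here are placeholders for illustration only, not the study's actual values (those are listed in its Table 1):

```python
import numpy as np

# Placeholder parameters (illustrative only; see the study's Table 1).
V0, tau1, tau2 = 1.0, 10.0, 2.0  # amplitude constant, synaptic decay constants

def syn_current(t, spike_times, weights):
    """Synaptic current of Eq. (2): a weighted sum of difference-of-
    exponentials kernels over all presynaptic spikes earlier than t.

    spike_times: list of per-neuron lists of previous spike times t_i
    weights:     list of per-neuron synaptic weights w_i
    """
    total = 0.0
    for w, spikes in zip(weights, spike_times):
        for ti in spikes:
            if ti < t:  # only spikes in the past contribute
                total += w * (np.exp(-(t - ti) / tau1)
                              - np.exp(-(t - ti) / tau2))
    return V0 * total
```

Note that this direct evaluation must keep all previous spike times in memory, and its cost grows with the number of stored spikes; the auxiliary variables y_2 and y_3 introduced below remove both burdens.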

Exact subthreshold integration of the integrate-and-fire model

Exact integration without synaptic currents
Without synaptic currents, (1) is a one-dimensional system,

    C dV/dt = -g (V - E) + I_ext.

By substituting

    y_1 = V - E - I_ext/g,

this is solved exactly as

    y_1(t) = y_1(0) e^{-gt/C}.    (3)

Henceforth we consider the evolution of y_1, noting that, given our parameter values (Table 1), we may always revert to V via V = y_1 + E + I_ext/g. Eq. (3) shows that y_1 exponentially decays to zero from y_1(0) < 0. Larger I_ext makes y_1(0) more negative, and hence increases the rate of decay towards zero (somewhat like stretching a spring increases the speed of its subsequent compression). It follows that the membrane potential V may only cross its spike threshold V_thr when the corresponding threshold y_1^thr = V_thr - E - I_ext/g is negative. For our parameter values (Table 1), this requires I_ext > 0.18.
Given some initial condition y_1(0) and a time step h, the constant P(h) = e^{-gh/C} exactly computes y_1(nh) for any integer n,

    y_1(nh) = P(h)^n y_1(0).    (4)

Note that P(h) needs to be calculated only once.
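A minimal sketch of this one-dimensional propagator, with placeholder values for g, C and h (the study's values appear in its Table 1):

```python
import numpy as np

# Placeholder parameters (illustrative only; see the study's Table 1).
g, C, h = 0.1, 1.0, 0.1  # membrane conductance, capacitance, time step

P = np.exp(-g * h / C)  # scalar propagator of Eq. (4); computed once

def step_y1(y1):
    """Advance y1 by one time step h, exactly (Eq. (4))."""
    return P * y1
```

Because each step is exact, n applications of step_y1 reproduce the closed-form solution (3), y_1(nh) = y_1(0) e^{-gnh/C}, to machine precision, regardless of the step size h.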

Exact integration with synaptic currents
Incorporation of synaptic currents makes (1) three-dimensional. We now introduce two further auxiliary variables, y_2 and y_3, such that

    τ_1 dy_2/dt = -y_2,    τ_2 dy_3/dt = -y_3.    (5)

At the arrival of each spike from neuron i, y_2 and y_3 are modified as

    y_2 → y_2 + w_i,    y_3 → y_3 + w_i.    (6)

We may then restate (2) as

    I_syn = V_0 (y_2 - y_3).

We hence incorporate the effects of all previous spikes, without the need to store individual spike times (Brette et al., 2007). Eq. (6) is possible because Eqs. (5) are both linear.
We may now restate (1) as

    C dy_1/dt = -g y_1 + V_0 (y_2 - y_3),    (7)

and we may restate (7) and (5) in matrix form,

    dy/dt = A y,    A = [ -g/C   V_0/C   -V_0/C
                            0    -1/τ_1     0
                            0      0     -1/τ_2 ],    (8)

where y = (y_1, y_2, y_3)^T represents the state vector of the system. The solution of (8) is equivalent to (3),

    y(t) = e^{At} y(0),

where e^{At} is known as the matrix exponential. As the exponential function is defined by its series expansion, e^a = Σ_{n=0}^∞ a^n/n!, so the matrix exponential function is defined by a corresponding series expansion, e^A = Σ_{n=0}^∞ A^n/n!, which finally gives us the propagator P(h) = e^{Ah}. This matrix is constant, needs to be calculated only once and, in symmetry with (4), computes y(nh) exactly for any integer n, y(nh) = P(h)^n y(0).
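The full scheme can be sketched as follows. Here the propagator is built from a truncated form of the series expansion of e^{Ah} described above (in practice a library routine such as scipy.linalg.expm could be used instead); all parameter values are placeholders rather than the study's (Table 1):

```python
import numpy as np

# Placeholder parameters (illustrative only; see the study's Table 1).
g, C, V0, tau1, tau2, h = 0.1, 1.0, 1.0, 10.0, 2.0, 0.1

# System matrix A of Eq. (8) for the state y = (y1, y2, y3).
A = np.array([[-g / C,  V0 / C,      -V0 / C],
              [0.0,    -1.0 / tau1,  0.0],
              [0.0,     0.0,        -1.0 / tau2]])

def expm_series(M, terms=30):
    """Matrix exponential via its defining series, e^M = sum_n M^n / n!.
    A truncation suffices here because the entries of A*h are small."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for n in range(1, terms):
        term = term @ M / n   # M^n / n!, built incrementally
        out = out + term
    return out

P = expm_series(A * h)  # propagator P(h) = e^{Ah}; computed once

def step(y):
    """Advance the state y by one time step h, exactly: y -> P(h) y."""
    return P @ y

def on_spike(y, w):
    """Spike arrival from a presynaptic neuron of weight w (Eq. (6))."""
    y[1] += w
    y[2] += w
    return y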

Estimation of continuous spike times
Spikes occur between grid points, and are estimated by interpolation. For instance, consider a membrane potential V that has crossed its threshold V_thr in the interval [t, t + h]. Linear interpolation would estimate the spike time t_spike as

    t_spike = t + h (V_thr - V(t)) / (V(t + h) - V(t)).
Quadratic or cubic interpolation may also be used and may be more precise.
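A minimal sketch of the linear estimate (the function name is ours, for illustration):

```python
def spike_time_linear(t, h, V_t, V_th, V_thr):
    """Linearly interpolate the threshold crossing within [t, t + h].

    V_t:   membrane potential at the left grid point t
    V_th:  membrane potential at the right grid point t + h
    V_thr: spike threshold, with V_t < V_thr <= V_th
    """
    return t + h * (V_thr - V_t) / (V_th - V_t)
```

The estimate is exact when V is linear on the interval; for the exponential trajectories above, the error shrinks with the step size h, and higher-order interpolation reduces it further.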
Estimation of spike times introduces a number of technical caveats, involving changes in the timing of the postsynaptic potential and emergence from the refractory period. These technical issues are discussed in detail in Morrison et al. (2007), Section 4.2.