
Interspike interval correlations in neuron models with adaptation and correlated noise

  • Lukas Ramlow ,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    lukas.ramlow@bccn-berlin.de

    Affiliations Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany, Physics Department, Humboldt University zu Berlin, Berlin, Germany

  • Benjamin Lindner

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany, Physics Department, Humboldt University zu Berlin, Berlin, Germany

Abstract

The generation of neural action potentials (spikes) is random but nevertheless may result in a rich statistical structure of the spike sequence. In particular, contrary to the popular renewal assumption of theoreticians, the intervals between adjacent spikes are often correlated. Experimentally, different patterns of interspike-interval correlations have been observed and computational studies have identified spike-frequency adaptation and correlated noise as the two main mechanisms that can lead to such correlations. Analytical studies have focused on the single cases of either correlated (colored) noise or adaptation currents in combination with uncorrelated (white) noise. For low-pass filtered noise or adaptation, the serial correlation coefficient can be approximated as a single geometric sequence of the lag between the intervals, providing an explanation for some of the experimentally observed patterns. Here we address the problem of interval correlations for a widely used class of models, multidimensional integrate-and-fire neurons subject to a combination of colored and white noise sources and a spike-triggered adaptation current. Assuming weak noise, we derive a simple formula for the serial correlation coefficient, a sum of two geometric sequences, which accounts for a large class of correlation patterns. The theory is confirmed by means of numerical simulations in a number of special cases including the leaky, quadratic, and generalized integrate-and-fire models with colored noise and spike-frequency adaptation. Furthermore we study the case in which the adaptation current and the colored noise share the same time scale, corresponding to a slow stochastic population of adaptation channels; we demonstrate that our theory can account for a nonmonotonic dependence of the correlation coefficient on the channel’s time scale. Another application of the theory is a neuron driven by network-noise-like fluctuations (green noise). 
We also discuss the range of validity of our weak-noise theory and show that by changing the relative strength of white and colored noise sources, we can change the sign of the correlation coefficient. Finally, we apply our theory to a conductance-based model which demonstrates its broad applicability.

Author summary

The elementary processing units in the central nervous system are neurons that transmit information by short electrical pulses, so-called action potentials or spikes. The generation of the action potential is a random process that can be shaped by correlated fluctuations (colored noise) and by adaptation. A consequence of these two ubiquitous features is that the successive time intervals between spikes, the interspike intervals, are not independent but correlated. As these correlations can significantly improve information transmission and weak-signal detection, it is an important task to develop analytical approaches to these statistics for well-established computational models. Here we present a theory of interval correlations for a widely used class of integrate-and-fire models endowed with an adaptation mechanism and subject to correlated fluctuations. We demonstrate which patterns of interval correlations can be expected from the interplay of colored noise, adaptation and intrinsic nonlinear dynamics.

Introduction

Neural activity or spiking is a stochastic process due to the presence of multiple sources of noise, including thermal, channel, and synaptic noise [1]. The study of neural systems in terms of stochastic models is hence vital for understanding spontaneous neural activity as well as neural information processing. Particularly useful in this respect are integrate-and-fire (IF) models [2–4] because these models are often analytically tractable and thus permit insights into the interplay of noise, signals, and nonlinear neural dynamics. It should also be noted that they can mimic the neural response to in-vivo-like inputs for some cells surprisingly well [5–7] and there exist procedures to systematically map biophysically detailed conductance-based models to this model class (see e.g. [8]).

A common simplification in the study of neural spike generators lies in the assumption that the times between subsequent spikes, the interspike intervals (ISIs), are statistically independent. Put differently, neural spiking is assumed to be a renewal process [9], which allows for a far-reaching theory of neural interactions in recurrent networks [10, 11]. We note that simple (one-variable) IF neurons, if driven by uncorrelated fluctuations, will generate exactly such a renewal spike train, and for this reason much theoretical effort has focused on the problem of calculating the ISI probability density (a statistic that completely characterizes a renewal process) [2, 12, 13].

Although renewal theory has been successful in describing some aspects of neural activity, there is increasing experimental evidence that in many cases ISIs are correlated over a few lags [14–23]. These correlations are an important statistic of spike trains as they shape spectral measures and therefore have consequences for information transmission and signal detection [15, 16, 21, 24–27].

Such correlations can be quantified by the serial correlation coefficient (SCC)

ρk = 〈δTi δTi+k〉 / 〈δTi2〉, with δTi = Ti − 〈Ti〉, (1)

where Ti is the ith ISI, k is the lag and 〈⋅〉 denotes the ensemble average. The SCC ρk measures whether the deviations of two intervals from the mean are on average proportional (ρk > 0), anti-proportional (ρk < 0) or independent of each other (ρk = 0).
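The SCC of Eq (1) is straightforward to estimate from data. The following is a minimal sketch (the estimator and the two test sequences are illustrative constructions, not taken from the paper):

```python
import numpy as np

def serial_correlation_coefficient(isis, k):
    """Estimate the SCC rho_k of Eq (1) from a stationary ISI sequence:
    the covariance of intervals k lags apart, normalized by the variance."""
    dT = np.asarray(isis, dtype=float)
    dT = dT - dT.mean()
    var = np.mean(dT * dT)
    if k == 0:
        return 1.0
    return np.mean(dT[:-k] * dT[k:]) / var

# A renewal (independent) sequence should give rho_k close to 0 for k >= 1,
# while strictly alternating long/short intervals give rho_1 close to -1.
rng = np.random.default_rng(1)
renewal = rng.exponential(1.0, size=100_000)
alternating = 1.0 + 0.2 * (-1.0) ** np.arange(100_000) \
    + 0.01 * rng.standard_normal(100_000)
```

Applied to the two toy sequences, the estimator recovers the expected renewal (ρ1 ≈ 0) and anti-correlated (ρ1 ≈ −1) behavior.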

Positive correlations can be induced by correlated input due to synaptic filtering [28, 29], slow network processes [30–32] or channel noise with slow kinetics [33, 34]. Another mechanism, exhibited by many neurons and commonly associated with negative ISI correlations, is spike-frequency adaptation, i.e. the gradual increase of the ISI following the initial decrease caused by a stimulation. Adaptation currents include calcium-gated potassium currents and M-type currents as well as the slow recovery of sodium channels [35] (for the computational role of these and other neural adaptation mechanisms, see [36]). Typical time scales of these currents range from 50 ms to 1 s and can therefore by far exceed the mean ISI. Interestingly, distinct causes of correlations can result from a single source: in neurons of the sensory periphery, adaptation-channel noise, i.e. the stochastic opening and closing of slow ion channels that mediate an adaptation current, may dominate the spiking statistics and provide at the same time adaptation and correlated fluctuations [33, 34].

While both correlated input, in the form of colored noise, and adaptation and their implications for ISI correlations have been studied separately [37–39], a general theory that allows one to calculate the SCC in the presence of multiple correlation-inducing processes is still missing. In this article we extend the weak-noise theory developed by Schwalger and Lindner [39] for mean-driven neurons to include multiple correlation-inducing processes. Our theory mainly applies to noisy neurons in the sensory periphery, for which the type of noise and the adaptation mechanisms are known. Cortical neurons, on the other hand, are more difficult to treat: they typically operate in an excitable firing regime and the statistics of the network noise that drives them are generally not known; below we discuss a special case of cortical firing that can nevertheless be captured by our theory.

We relate statistics of the spike train, namely the SCC ρk, to intrinsic properties of the nonlinear neural dynamics captured by the phase-response curve (PRC). The PRC measures the shift of the next spike time of a neuron subject to a small perturbation at different times in the firing cycle [40–42]; see Fig 1 for an illustration of the method. The shape of the PRC depends crucially on the neuron type: type I neurons, which bifurcate from a quiescent to a tonically firing regime via a saddle-node-on-invariant-circle bifurcation, possess a purely positive PRC, whereas type II neurons, which undergo a supercritical Hopf bifurcation, have a partially negative PRC (see [8, 42] for more details on these neuron types). Our theory applies to both neuron types and predicts qualitatively novel patterns of interval correlations that deviate from a single geometric sequence; such deviations have recently been reported experimentally [43, 44].

Fig 1. Linear response of a neuron model's spike timing.

The membrane potential of a tonically firing neuron with deterministic ISI T* is subject to an arbitrary perturbation u(ti + τ) (left panel, red line). Here ti is the time of a reference spike and τ ∈ [0, T*] is the relative time since the last spike, resembling a phase. How strongly this perturbation advances or delays the “phase” τ will in general depend on the “phase” itself at which the perturbation is applied. This sensitivity is quantified by the phase-response curve Z(τ), shown in blue in the middle panel. The term Z(τ)u(ti + τ) can thus be thought of as a perturbation of the phase (middle panel, red line). In linear response these phase perturbations can be integrated to yield the cumulative phase shift or spike time deviation δTi+1 = ti+1 − (ti + T*) (right panel, red arrow). Note that separating the perturbation from the deterministic dynamics of the neuron model, i.e. finding u(ti + τ), is part of the problem that we address in this paper and solve in detail in the Methods section.

https://doi.org/10.1371/journal.pcbi.1009261.g001
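The linear-response construction of Fig 1 can be sketched numerically. The PRC shape, the perturbations, and the sign convention used below are illustrative assumptions chosen for the sketch, not the paper's exact expressions:

```python
import numpy as np

T_star = 1.0                                  # deterministic period (assumed)
tau = np.linspace(0.0, T_star, 1001)
dt = tau[1] - tau[0]

# Illustrative type II PRC: negative for tau < T*/2, positive afterwards,
# loosely mimicking the partially negative PRCs discussed later in the text.
Z = -np.cos(np.pi * tau / T_star)

def spike_time_shift(u):
    """Integrate the phase perturbation Z(tau)*u(tau) over the cycle; with
    the sign convention assumed here, a negative value means the next spike
    is advanced (shorter ISI), a positive value means it is delayed."""
    return -float(np.sum(Z * u) * dt)

# The same brief positive input delays the spike when applied early
# (where Z < 0) and advances it when applied late (where Z > 0).
early = np.where(tau < 0.3 * T_star, 1.0, 0.0)
late = np.where(tau > 0.7 * T_star, 1.0, 0.0)
```

The opposite signs of the two shifts illustrate why the timing of a perturbation within the cycle, weighted by the PRC, decides whether an interval is shortened or lengthened.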

The paper is organized as follows. We first introduce the broad class of models and the correlation measure of interest; we illustrate both by a special case, the quadratic IF model with adaptation and colored noise. In the following section, we present the general expression of the serial correlation coefficient in terms of the phase-response curve, adaptation kernel, and correlation function of the colored noise. We then explore the role of the specific shape of the phase-response curve on the SCC by considering integrator and resonator models with purely positive and partially negative PRC, respectively. We also discuss the case of a slow population of stochastic ion channels that can be approximated by our model [33]. Finally, we demonstrate that our theory can be applied to a conceptually different model, namely the conductance-based Traub-Miles model with an M current. We conclude our study with a discussion of the results in the context of neural information transmission and give an outlook to several open problems.

Results

Here we study a stochastic multidimensional integrate-and-fire neuron model with membrane potential v(t) and N auxiliary variables wj(t) that is subject to spike-triggered adaptation (variable a(t)) as well as correlated and uncorrelated Gaussian noise sources η(t) and ξv(t), respectively:

v̇ = f0(v, w) + μ − a + η(t) + √(2D) ξv(t), (2a)
τj ẇj = fj(v, w), j = 1, …, N, (2b)
τa ȧ = −a + Δ Σi δ(t − ti), (2c)
τη η̇ = −η + √(2σ2τη) ξη(t). (2d)

We apply the usual fire-and-reset rule: when v(t) reaches a threshold vT, a spike is triggered at time ti = t; the membrane potential v and the auxiliary variables wj are instantaneously reset to v = vR and w = wR, respectively. In contrast, a(t) undergoes a jump by Δ/τa ≥ 0. In the absence of any noise we assume that the system approaches a limit cycle (dashed line in Fig 2B) with a fixed period T* and a unique value of the adaptation variable right after a spike, a*.
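As a concrete illustration of the model class and the fire-and-reset rule, here is a minimal Euler–Maruyama sketch of the one-dimensional (N = 0) leaky IF specialization with adaptation and Ornstein-Uhlenbeck noise; the threshold vT = 1, reset vR = 0, time step and all default parameter values are placeholder choices for the sketch:

```python
import numpy as np

def simulate_adaptive_lif(mu=20.0, gamma=1.0, tau_a=2.0, delta=2.0,
                          tau_eta=0.5, sigma2=0.02, D=1e-3,
                          v_t=1.0, v_r=0.0, dt=1e-4, t_max=50.0, seed=0):
    """Euler-Maruyama integration of the N = 0 (leaky) special case of
    Eq (2) with spike-triggered adaptation a(t) and OU noise eta(t).
    Threshold v_t, reset v_r and all defaults are illustrative choices."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    xi_v = np.sqrt(2.0 * D * dt) * rng.standard_normal(n)
    xi_e = np.sqrt(2.0 * sigma2 / tau_eta * dt) * rng.standard_normal(n)
    v, a, eta = v_r, 0.0, 0.0
    spikes = []
    for i in range(n):
        v += dt * (-gamma * v + mu - a + eta) + xi_v[i]
        a += -dt * a / tau_a
        eta += -dt * eta / tau_eta + xi_e[i]
        if v >= v_t:                    # fire-and-reset rule
            spikes.append(i * dt)
            v = v_r                     # fixed voltage reset
            a += delta / tau_a          # jump of the adaptation variable
    return np.diff(spikes)              # the interspike intervals

isis = simulate_adaptive_lif()
```

The returned ISI sequence can then be fed into an SCC estimator to compare simulated correlation patterns against the theory.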

Fig 2. Serial correlation coefficients for the adaptive QIF (Theta) model with colored noise.

Panel A shows the transformed membrane potential θ(t) = 2tan−1v(t), adaptation current a(t) and colored noise η(t) with spike times {ti}, ISIs {Ti} and peak adaptation values {ai}. Panel B displays the deterministic limit cycle (dashed line) and exemplary noisy trajectory (solid line) in the phase plane (θ, a). Note that the jump is of constant size Δ/τa, while the voltage or equivalently phase always resets to a fixed value θR. Panel C depicts the corresponding type I PRC quantifying the QIF model’s response characteristics. For a non-adaptive QIF model the PRC would be symmetric around T*/2; for the adaptive QIF, however, the maximum is shifted towards the right, i.e. the neuron is particularly sensitive to stimuli applied at the end of the ISI. Panel D shows the SCC with initial, slightly positive correlation coefficient due to positively correlated noise and subsequent negative correlations governed by adaptation. This pattern cannot be described by a single geometric sequence. Parameters: μ = 5, τa = 6, Δ = 18, τη = 4, σ2 = 0.5, D = 0 and resulting T* ≈ 4.0 and coefficient of variation CV ≈ 0.2.

https://doi.org/10.1371/journal.pcbi.1009261.g002

Here we focus on the full stochastic system, in which the uncorrelated noise sources ξv(t), ξη(t) are independent zero-mean Gaussian white noise processes with 〈ξ(t)ξ(t′)〉 = δ(t′ − t). As a consequence, η(t) represents a temporally correlated (colored) Ornstein-Uhlenbeck (OU) process with auto-correlation function 〈η(t)η(t′)〉 = σ2exp(−|t′ − t|/τη), i.e. Eq (2d) is a Markovian embedding for low-pass filtered noise (for more general embeddings of colored noise in IF neurons, see [45]). The presence of both white and colored noise will affect the voltage dynamics directly. However, there is also an indirect effect through noise-induced deviations of the adaptation variable from the deterministic limit cycle. As outlined in the Methods section, the combined effect of the direct and indirect perturbations on the next spike time is subsumed in the perturbation function u(t) that measures the deviation from the deterministic limit cycle and is illustrated in Fig 1 (for the detailed definition of u(t) see Eq (35)). This function typically attains both positive and negative values and carries memory about previous activity and stimuli.
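The stated OU autocorrelation can be checked numerically: simulating the process and comparing its empirical variance and its autocorrelation at lag τη against the theoretical values σ2 and σ2e−1. The discretization and parameter values below are arbitrary choices for the sketch:

```python
import numpy as np

# Euler simulation of the OU process of Eq (2d); its stationary
# autocorrelation should be sigma^2 * exp(-|lag| / tau_eta).
tau_eta, sigma2 = 1.0, 0.5
dt, n = 1e-2, 1_000_000          # arbitrary discretization for the sketch
rng = np.random.default_rng(2)
xi = np.sqrt(2.0 * sigma2 / tau_eta * dt) * rng.standard_normal(n)
eta = np.empty(n)
eta[0] = 0.0
for i in range(1, n):
    eta[i] = eta[i - 1] - dt * eta[i - 1] / tau_eta + xi[i]

e = eta[n // 10:]                # discard the initial transient
lag = int(tau_eta / dt)          # one correlation time
var_emp = np.mean(e * e)         # should approach sigma2
corr_emp = np.mean(e[:-lag] * e[lag:]) / var_emp   # should approach exp(-1)
```

Both empirical moments approach the values implied by 〈η(t)η(t′)〉 = σ2exp(−|t′ − t|/τη) up to sampling error.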

The model is an extension of the one considered in [39]. The crucial novel feature is the addition of a colored noise process. Consequently, here we deal with two possible sources of interspike interval correlations: spike-frequency adaptation and slow fluctuations, as found for instance in the case of adaptation-channel noise [33, 34]. Note that the fixed reset of the voltage and auxiliary variables ensures that the spike train is a renewal process in the absence of these two slow processes. In addition, a minor difference between the models is the scaling of the jump amplitude; in [39] the jump amplitude was Δ and not Δ/τa, but the latter choice is the natural one for systems with adaptation-channel noise.

While our general theory does not impose any restriction on the dimensionality of the chosen IF model, we will discuss only a selection of paradigmatic models with at most one auxiliary variable, namely the quadratic IF (N = 0), leaky IF (N = 0) and generalized IF (N = 1) models. An interesting special case, the adaptive quadratic integrate-and-fire (QIF) model, is considered in Fig 2. It is a model without auxiliary variables (N = 0), with a quadratic nonlinearity in the voltage dynamics, f0(v) = v2, and threshold and reset points at infinity, vT = −vR = ∞. The model is the normal form of a type I neuron, for which the transition from the excitable to the tonically firing regime occurs via a saddle-node bifurcation [8, 42], implying a non-negative phase-response curve (PRC) [46]. Furthermore, the QIF model is equivalent to the Theta-neuron by the transformation θ = 2tan−1v and thus possesses identical statistical properties; in particular, the two models share identical ISI correlations.
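The stated equivalence can be verified numerically for the noise-free, non-adapting QIF, whose deterministic period follows from direct integration, T* = ∫ dv/(v2 + μ) = π/√μ; integrating the Theta-neuron velocity over one cycle must yield the same period:

```python
import numpy as np

# Noise-free, non-adapting QIF: dv/dt = v^2 + mu with v_T = -v_R = infinity
# has deterministic period T* = integral dv/(v^2 + mu) = pi/sqrt(mu).
# The substitution theta = 2*arctan(v) turns it into the Theta-neuron
# d(theta)/dt = (1 - cos(theta)) + mu*(1 + cos(theta)), theta in [-pi, pi],
# so integrating d(theta)/theta_dot over one cycle must give the same T*.
mu = 5.0

def theta_period(mu, n=100_000):
    """Midpoint-rule quadrature of dt = d(theta)/theta_dot over one cycle."""
    theta = np.linspace(-np.pi, np.pi, n + 1)
    mid = 0.5 * (theta[:-1] + theta[1:])
    theta_dot = (1.0 - np.cos(mid)) + mu * (1.0 + np.cos(mid))
    return float(np.sum(np.diff(theta) / theta_dot))

T_theta = theta_period(mu)
T_qif = np.pi / np.sqrt(mu)
```

Note that the period T* ≈ 4.0 in Fig 2 is longer than π/√μ ≈ 1.4 because there the adaptation current effectively reduces the drive μ.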

The time courses θ(t), a(t) and η(t) and the limit cycle in the θ-a-plane are shown in Fig 2A and 2B, respectively. The central response characteristic of a tonically firing neuron, the PRC Z(t), is displayed in Fig 2C. Our theory, which is based on this PRC and is detailed in the following, shows excellent agreement with the numerically simulated SCC (theoretical predictions and simulation results are shown in Fig 2D). The pattern shown is unlike any discussed in the theoretical literature on ISI correlations so far: very weak positive correlations ρ1 between adjacent ISIs and pronounced negative correlations at all higher lags k > 1. This is due to a non-trivial interplay between adaptation and colored noise. The observed shape of the correlations is only one of several distinct patterns that are possible in our model; these are explored in the following by means of our analytical approximations.

Generally, and in line with our weak-noise assumption, we discuss cases where the noise intensity is rather small. However, in a later section we explore the range of validity of our theory in terms of the output variability of the spike train and find quantitative agreement up to CV ≈ 0.2. Qualitatively, typical correlation patterns (i.e. the SCC as a function of the lag) are well described for even larger values up to CV ≈ 0.7.

General expression for the correlation coefficient

As pointed out above, we assume that our model in the absence of noise operates in the tonically firing regime with deterministic period T* and that the spike train in the presence of noise is a stationary stochastic process. As shown in the Methods section, if the neuron is subject to weak noise, the interspike intervals will be correlated according to the serial correlation coefficient (SCC) (3) with coefficients (4a) (4b) (4c) (4d) and deterministic peak adaptation value a* = (Δ/τa)/(1 − exp[−T*/τa]). Remarkably, the dependence on the lag k is carried exclusively by the two specific SCCs ρk,a and ρk,η. The prefactors, though, depend on properties of both the adaptation and the noise sources. The coefficient ρk,a describes the correlations in the case of adaptation and purely white noise (σ = 0). The second coefficient ρk,η represents the correlations in the absence of adaptation (Δ = 0) but with the combination of white and colored noise. The specific SCC ρk,a is identical to the one derived in [39]: (5) The SCC ρk,η is derived in the Methods section and reads (6) Interestingly, Eq (6) is the only place where the noise strength parameters D and σ2 enter, and they do so as the ratio of noise intensities D/(τησ2). In this formulation it is simple to see that ρk,η vanishes for D ≫ τησ2. In the opposite limit of vanishing white noise (D = 0), Eq (6) coincides with the expression derived in [47].

Our main result, Eq (3), implies that the SCC for a general stochastic IF model with both adaptation and correlated noise is in fact a linear combination of two geometric sequences. These sequences are determined by the two correlation-inducing processes and agree with the specific SCCs Eqs (5) and (6) except for a constant prefactor (constant with respect to the lag k). A sum of two geometric sequences as in Eq (3) can exhibit completely different patterns of interval correlations compared to the single geometric sequence obtained in previous theoretical calculations [33, 37, 38, 47–49]. We recall that the absolute values of the elements of a geometric sequence sk = s0rk decay exponentially with the lag k for physically plausible values |r| < 1. In addition, the SCC’s sign is determined by the prefactor s0, and the sign of adjacent elements may alternate depending on the sign of the base r. The two possible signs of s0 and r allow for four distinct patterns of correlations in the case of a single geometric sequence.
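The sum-of-two-geometric-sequences structure is easy to explore with toy numbers. In the sketch below, the prefactors and bases are invented illustrative values (not the model's coefficients from Eq (4)), chosen to reproduce a Fig 2D-like pattern: a weakly positive ρ1 followed by negative SCCs at all higher lags, which a single geometric sequence cannot produce:

```python
import numpy as np

k = np.arange(1, 9)

# Two toy geometric sequences standing in for the specific SCCs; the
# prefactors and bases below are invented numbers, not Eq (4)'s values.
rho_a = -0.30 * 0.85 ** (k - 1)   # adaptation-like: negative, slowly decaying
rho_eta = 0.35 * 0.40 ** (k - 1)  # colored-noise-like: positive, fast decaying

rho = rho_a + rho_eta             # linear combination, as in Eq (3)
# rho[0] (lag 1) is weakly positive; all higher lags are negative because
# the fast-decaying positive sequence loses against the slow negative one.
```

Swapping the signs or decay rates of the two toy sequences generates the other mixed patterns discussed around Fig 3.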

Possible shapes that result from the interplay between the specific SCCs are illustrated in Fig 3. The pattern in Fig 3A, for instance, is characterized by a very small positive first correlation coefficient (which could also be amplified or diminished by fine-tuning parameters), whereas higher lags show pronounced negative correlations, a structure that cannot be generated by a single geometric sequence. In Fig 3B the inverse case is shown: a weak and negative first correlation coefficient followed by stronger positive coefficients at higher lags. Deviations from a single geometric sequence have been seen experimentally (see [44] for a recent example). Indirect evidence for the combination of short-term negative and long-term positive correlations has been reported by means of the Fano factor [50–52] (see [21] for an explanation of the underlying connection between correlations and the Fano factor).

Fig 3. General correlation coefficient ρk of the adaptive LIF model subject to white and colored noise.

The specific SCCs ρk,a and ρk,η are obtained by considering one correlation-inducing process at a time, i.e. adaptation and white noise or colored and white noise, respectively. Two qualitatively different cases are displayed, distinguished by i) the base pattern exhibited by ρk,a, which is exponentially decaying in A and oscillatory in B, and ii) the sign of the first (k = 1) and every subsequent (k > 1) SCC. For example, consider A, where ρ1 > 0 and ρk < 0 for k > 1. The inverse case is shown in B, i.e. negative correlations at lag 1 and positive ISI correlations for every subsequent lag. Such patterns have been reported in cat peripheral auditory fibers and the electroreceptors of weakly electric fish [50, 51] and cannot be explained by adaptation or colored noise alone. Note that the SCC of the full model is not bounded by the specific SCCs. Parameters (A, B): γ = 1, μ = (5, 20), τa = (2, 1), Δ = (2, 10), τη = (0.5, 5), σ2 = 2 ⋅ 10−2, D = 10−3.

https://doi.org/10.1371/journal.pcbi.1009261.g003

A closer inspection of Eqs (5) and (6) reveals that interval correlations introduced by the OU process lead to a coefficient ρk,η that can only decay exponentially with k because for the base of the power βk−1 we have the condition 0 < β < 1, according to Eq (4d). A richer repertoire, however, becomes possible for ρk,a because the base of the power can attain values from a broader interval −1 < αν < 1 (see Methods); note that α > 0 and hence the sign of the base is determined by ν. Thus oscillatory correlation coefficients enveloped by an exponential function emerge for a negative base, ν < 0. For ν > 0 the purely exponential case is recovered.

In addition to the base, the prefactor can also attain different signs, which depends specifically on the neuron’s PRC. In the following sections we discuss patterns of interval correlations for two distinct cases: the leaky IF model with a non-negative PRC and the generalized IF model with a partially negative PRC.

Adaptive leaky integrate-and-fire model with colored noise

For one-dimensional IF models, i.e. f0(v, w) = f(v), the PRC can be calculated analytically by means of the adjoint method Eq (17) (7) where the prime denotes the derivative with respect to v and Z(T*) = [f(vT) + μ − a* + Δ/τa]−1 is the inverse velocity of the deterministic system at the threshold, see Eq (20). The term a* − Δ/τa corresponds to a(T*) right before the spike (recall that a* is the deterministic peak adaptation value right after the spike).

Since the neuron is required to fire in the absence of noise, this velocity is positive and so is the PRC. Put differently, for every one-dimensional IF model, a positive kick in the voltage variable will always advance the phase as it brings the neuron model closer to the threshold.

For the adapting leaky IF model (f0(v) = −γv, N = 0) in particular, the PRC reads [39] (8) As discussed in the previous section, the specific SCC ρk,a is a geometric sequence with an oscillatory or exponential base pattern, distinguished by ν < 0 and ν > 0, respectively. Its prefactor (using |αν| < 1 and 0 < α < 1) depends specifically on the PRC and is always negative for positive PRCs. This is so because, according to Eq (4d), for positive PRCs we find ν < 1. The second term ρk,η decays exponentially with lag k and possesses a non-negative prefactor because of the positive PRC of the considered model (this becomes evident in Eq (32)). To summarize, in a type I neuron model with non-negative PRC, correlated noise leads to positive interval correlations with a time constant equal to the correlation time of the noise. In contrast, spike-triggered adaptation evokes a negative correlation between adjacent intervals (k = 1) followed by an exponential decay in amplitude at higher lags (k > 1) that can be either monotonic or oscillatory.

The relation between specific and general SCCs for the LIF model was already discussed in the preceding section; cf. Fig 3. In Fig 4 we inspect how the general correlation coefficient depends on the correlation time of the colored noise τη, here given in multiples of the unperturbed period T*. We distinguish the two possible cases of strong (Fig 4A) and weak (Fig 4B) adaptation in terms of the parameter ν, which can be recast as ν = (f(vR) + μ − a*)Z(0) (derivation similar to [38], Appendix 4.1.2). The sign of ν is thus determined by the temporal derivative of the voltage right after the reset in the noiseless system. For weak adaptation (ν > 0) the voltage after reset runs on average towards the threshold. In contrast, for strong adaptation (ν < 0) the peak adaptation value a* is so high that the voltage, after being reset to vR, drops on average even further towards more hyperpolarized values.
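The strong/weak classification can be checked directly from the quantities quoted in the text: the peak adaptation value a* = (Δ/τa)/(1 − exp[−T*/τa]) and the post-reset drift f(vR) + μ − a*, whose sign determines that of ν. Taking vR = 0 as an assumed reset value (it is not specified in this section) and the parameter sets of Fig 4:

```python
import numpy as np

def post_reset_drift(mu, gamma, tau_a, delta, T_star, v_r=0.0):
    """Voltage derivative right after reset for the leaky IF model,
    f(v_r) + mu - a*, with a* from the formula quoted in the text.
    v_r = 0 is an assumed reset value (not specified in this section)."""
    a_star = (delta / tau_a) / (1.0 - np.exp(-T_star / tau_a))
    return -gamma * v_r + mu - a_star

# Fig 4's parameter sets: A should give strong adaptation (drift < 0,
# i.e. nu < 0) and B weak adaptation (drift > 0, i.e. nu > 0).
drift_A = post_reset_drift(mu=20.0, gamma=1.0, tau_a=2.0, delta=20.0, T_star=0.67)
drift_B = post_reset_drift(mu=5.0, gamma=1.0, tau_a=2.0, delta=2.0, T_star=1.04)
```

With these numbers, set A yields a* > μ (the voltage initially drops after reset), while set B yields a* < μ, consistent with the strong/weak labels in the figure.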

Fig 4. Pattern of interval correlations for the adaptive LIF model.

The PRC Z(τ) and SCC ρk are shown for two different cases: strong (ν < 0, A) and weak adaptation (0 < ν < 1, B). In both cases the colored noise correlation time τη is gradually increased. For small correlation times the SCC is governed by the adaptation, as the colored noise becomes essentially white (dark line and circles). In the other limit of long correlation times the SCCs are positive and governed by the colored noise (light line and circles). For intermediate time scales the SCCs are determined by both processes equally, as shown in Fig 3. Parameters (A, B): γ = 1, μ = (20, 5), τa = 2, Δ = (20, 2), σ2 = 0.1, D = 0 and resulting T* = (0.67, 1.04).

https://doi.org/10.1371/journal.pcbi.1009261.g004

For τη ≪ T* the noise becomes effectively white and consequently correlations are introduced solely by the adaptation, i.e. the general SCC reduces to ρk,a. In the other limit of long-correlated noise with τη ≫ T*, the general SCC coincides with ρk,η if there is no white noise present (D = 0). This implies that in the limit of long-correlated noise the origin of positive ISI correlations (the colored noise) wins against the origin of negative ISI correlations (the adaptation), and consequently ρk > 0. Why does the colored noise dominate in this limit? A long-range-correlated noise η(t) can be regarded as a constant modulation of the input μ over many ISIs, leading to similar deviations of adjacent intervals from the mean ISI. The adaptation, acting on a finite time scale τa, will reduce these deviations in adjacent intervals but cannot change their common sign.

For intermediate values τη ≈ T* the SCCs can be governed by one process for small lags and by the other for higher lags, which is the case shown in Fig 4B. The first SCC is governed by the positively correlated noise while the remaining SCCs are negative due to the adaptation.

Adaptive generalized integrate-and-fire model with colored noise

In the two-dimensional case, as for instance for the generalized IF (GIF) model [53, 54]

v̇ = −γv − βww + μ − a + η(t) + √(2D) ξv(t), (9a)
τw ẇ = v − w, (9b)

the PRC can be partially negative and resembles type II resetting. Positive kicks applied to the voltage at appropriate time instances can thus prolong the ISI. The PRC can be calculated analytically [39]: (10) where λ = γ + 1/τw, and w0(T*) is the deterministic value of the auxiliary variable w at the threshold. Having a second variable not only changes the PRC qualitatively but also affects the deterministic period T* and, consequently, the parameters α and β. We would like to emphasize that the GIF model includes the LIF model as a limit case. For this reason we expect that all patterns shown by the LIF model can also be realized by the GIF model.

The observed patterns for ν < 0 and 0 < ν < 1, shown in Fig 5A and 5B, can also be realized by the LIF model and have been discussed in the preceding section (details depend on the specific parameters and model, though).

Fig 5. Pattern of interval correlations for the adaptive GIF model.

The PRC Z(τ) and SCC ρk are shown for three different cases corresponding to ν < 0 (A), 0 < ν < 1 (B) and ν > 1 (C). The first two cases A, B resemble the previously discussed cases of the adaptive LIF model, see Fig 4. For the third case C, both adaptation and correlated noise can have counterintuitive effects on the SCC if they act mainly on the proportion of the ISI where the PRC is negative. This can be ensured by an appropriate choice of the time scales, here τa ≈ T*/2 and τη ≈ T*/2. Thus the adaptation can give rise to positive interval correlations, and additional colored noise with varying correlation time initially decreases the SCCs for intermediate τη and eventually leads to enhanced positive correlations for large τη. Parameters (A, B, C): γ = (1, 1, −1), μ = (10, 20, 1), βw = (3, 1.5, 5), τw = (1.5, 1.5, 1.1), τa = (10, 10, 1), Δ = (10, 10, 2.3), σ2 = 10−3, D = 0 and resulting T* = (1.24, 0.57, 1.91).

https://doi.org/10.1371/journal.pcbi.1009261.g005

As a consequence of the partially negative PRC, the parameter ν can exceed one, as seen from Eq (4d), and introduce positive correlations even if the driving noise is correlated only over short times. This is seen in Fig 5C, where the correlation coefficients for short correlation times are positive. We recall that this case cannot be realized by an LIF model with adaptation and thus represents a novel feature of the GIF model. The intricate dependence of ρ1 on the correlation time τη is presented in a different way in Fig 6A and contrasted with a similar case in the absence of adaptation in Fig 6B.

Fig 6. First serial correlation coefficient of the GIF model with respect to the correlated noise time constant τη.

Panel A shows the SCC for an adaptive GIF with parameters similar to those in Fig 5C. In panel B we consider a GIF model without adaptation and parameters chosen so that the PRCs in A and B qualitatively agree. The SCC at lag k = 1 can exhibit non-monotonic behavior with respect to the time constant τη due to the partially negative PRC. The PRC is shown in the upper left inset and is in both cases negative until τ ≈ T*/2. If the correlation time matches this proportion of the ISI, the SCC is significantly decreased compared to the case of short correlation times. For τη ≫ T* adjacent ISIs are positively correlated as they are similarly affected by the slowly varying noise. Parameters (A, B): γ = −1, μ = 1, βw = 5, τw = (1.1, 1.1), wR = (0, 1), τa = (1, 0), Δ = (2.3, 0), σ2 = 10−3, D = 0 and resulting T* = (1.91, 1.76).

https://doi.org/10.1371/journal.pcbi.1009261.g006

In order to understand the case ν > 1 shown in Fig 5C, first consider the effect of the adaptation separately from the colored noise. A shortened reference interval evokes a positive deviation of the peak adaptation value, δai = aia* affecting the next interval. If the corresponding inhibitory (negative) current [−δaiexp(−τ/τa), see Methods], acts mainly at the beginning of the next interval where the PRC is negative as well (for instance with τaT*/2 as in Fig 5C), it has a shortening effect on the subsequent interval. First and second interval are both shorter than the mean, implying a positive correlation; a similar line of arguments applies for a reference interval longer than the mean ISI. Now consider the additional effect of the correlated noise. First, short-range correlated noise (τη/T* = 10−2) is essentially white and does not introduce correlations. Therefore the SCC is governed by adaptation and remains positive by the mechanism discussed above. Secondly, consider larger correlation times still shorter than the proportion of the ISI for which the PRC is negative, e.g. τη/T* = 10−1. Values of η(t) that are preserved beyond a spike will then have opposite effects on the two intervals separated by the spike, inducing an anti-correlation of these intervals. As a consequence the overall SCC decreases initially with increasing τη. Finally, for correlation times equal or longer than the mean ISI, τηT*, the particular shape of the PRC becomes irrelevant and the positive correlations of the colored noise translate into positive correlations of the ISIs. The minimal correlation at intermediate values of the colored noise correlation time is demonstrated in Fig 6A. The asserted anti-correlation induced by a colored noise of intermediate correlation time is explicitly demonstrated in Fig 6B. Here we consider the GIF model without adaptation at parameters that ensure a negative PRC at short times. 
Clearly, the SCC is negative for intermediate values of the correlation time. This is an interesting result in its own right: a low-pass filtered noise can evoke negative ISI correlations in a resonator model. Besides the combinations of white noise and spike-triggered adaptation [16, 55, 56], white noise and short-term depression [47], and network noise from neurons firing more regularly than a Poisson process [47], this is yet another independent mechanism for the generation of negative ISI correlations.
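As a point of contrast, a low-pass filtered noise applied to an integrator model with an everywhere-positive PRC does not produce this anti-correlation; for long correlation times it yields positive ISI correlations instead. The following minimal simulation sketch illustrates this for an Euler-Maruyama LIF driven by an Ornstein-Uhlenbeck process (parameter values are our own illustrative choices, not those of the figures):

```python
import numpy as np

def lif_ou_isis(mu=5.0, tau_eta=2.0, sigma=0.3, D=1e-3,
                dt=2e-3, n_isi=3000, seed=1):
    """LIF v' = mu - v + eta + sqrt(2D)*xi with OU noise eta
    (correlation time tau_eta, variance sigma^2); returns the ISI sequence."""
    rng = np.random.default_rng(seed)
    v, eta = 0.0, 0.0
    t, t_last, isis = 0.0, 0.0, []
    while len(isis) < n_isi:
        xi_v, xi_e = rng.standard_normal(2)
        v += dt * (mu - v + eta) + np.sqrt(2 * D * dt) * xi_v
        eta += -dt * eta / tau_eta + sigma * np.sqrt(2 * dt / tau_eta) * xi_e
        t += dt
        if v >= 1.0:               # threshold crossing: register spike, reset
            isis.append(t - t_last)
            t_last, v = t, 0.0
    return np.array(isis)

isis = lif_ou_isis()
dT = isis - isis.mean()
rho1 = np.mean(dT[:-1] * dT[1:]) / np.mean(dT ** 2)
# for tau_eta >> T*, adjacent intervals are positively correlated (rho1 > 0)
```

Since τη here is roughly ten times the deterministic period, the slow noise acts quasi-statically and adjacent intervals are strongly positively correlated; the negative correlations discussed above require the partially negative PRC of the resonator.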

Leaky integrate-and-fire model with adaptation-channel noise

Adaptation and colored noise in our model Eq (2) can also be regarded as an idealized description of the current flowing through a population of stochastically opening and closing ion channels with adaptation-mediating voltage-dependent gating kinetics [33, 34]. A paradigmatic example is the Ca2+-dependent K+ current [55, 57] but several other candidates for such spike-triggered adaptation currents are known (see [35]). Whatever the type of channel is, the current through a single channel is highly stochastic and this remains true also for the summed current through a finite population of channels—this is what is commonly referred to as channel noise. As was demonstrated in [33], the total current through a finite population of adaptation channels can be split up into a deterministic part (equivalent to the adaptation dynamics Eq (2c)) and a stochastic part that corresponds to an Ornstein-Uhlenbeck process (our Eq (2d)).

In this interpretation of the model, the summed current a(t) + η(t) stems from one source, i.e. from the population of adaptation channels, and we regard the sum as adaptation-channel noise (the white noise may be regarded as resulting from faster, e.g. Na+, channels). Crucially, because noise and adaptation have a common origin, the previously independent time constants τa and τη have to be set equal. We will refer to the common time constant as τc ≡ τa = τη. Note that consequently we have α = β, parameters which were defined in Eq (4d). We emphasize that we will not consider explicit channel models here but refer the interested reader to [33, 34].

First note that our expression for the general SCC Eq (3) simplifies considerably if τa = τη since in this case the second term (B/C)ρk,η drops out. This is so because with α = β the prefactor B = 0, see Eq (4b). The possible interval correlations are thus determined by a single geometric sequence and comprise exponentially decaying or oscillatory patterns.

Quantitative agreement between simulations (circles) and theory (lines) for the first correlation coefficient is demonstrated in Fig 7 for the cases of weak (Fig 7A) and strong adaptation (Fig 7B) and varying intensities of white noise (black lines). We also compare to the limiting cases of vanishing colored fluctuations (σ = 0, blue dashed line) and vanishing adaptation (Δ = 0, orange dash-dotted line).

thumbnail
Fig 7. Serial correlation coefficients for the LIF with adaptation-channel noise.

The SCC of adjacent intervals ρ1 (black lines and dots) for our model Eq (2) with identical time constants τη = τa = τc shows non-monotonic behavior with a minimum as a function of τc given that the white noise intensity D is sufficiently large. This is so because the kick amplitude of the adaptation scales with 1/τc. The panels A and B correspond to weak and strong adaptation, respectively. The limiting cases are shown in orange (no adaptation or white noise, Δ = 0, D = 0) and blue (no colored noise, σ = 0, D = 0.1). However, only the limiting case of vanishing colored noise can be attained by the full model through varying the white noise intensity. Parameters: (A, B) γ = 1, μ = (5, 20), Δ = (2, 20), σ2 = 0.1. Note that in contrast to Figs 4–6, the deterministic period T* depends on τc for some of the curves. For the black and blue curves the deterministic period T* ∈ [0.56, 0.67] in A and T* ∈ [0.74, 1.08] in B and increases in both panels with τc. For the orange curve T* = 0.22 in A and T* = 0.05 in B.

https://doi.org/10.1371/journal.pcbi.1009261.g007

We recover a number of results known from the literature. First of all, if the total noise in the system is dominated by the uncorrelated fluctuations (D is sufficiently large), the SCC is negative and the absolute value is maximized if the time constant of the adaptation is about the mean ISI [33, 55]. Secondly, as already argued above and in line with results from [33] the correlations are always positive if τc is sufficiently large and the white noise is sufficiently weak, i.e. the stochasticity of the adaptation (described by the colored noise) wins against the feedback effect of the adaptation in determining the sign of the correlation coefficient. Finally, we find qualitative agreement of ρ1(τc) in our model with the channel model of Ref. [33]: it exhibits a non-monotonic shape with small negative correlations for small τc and positive correlations for larger values of τc (cf. solid line for D = 0.05 in Fig 7A, with empty circles in Figure 9 of Ref. [33]).
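The first of these results, a negative correlation between adjacent intervals when uncorrelated fluctuations dominate, is easy to reproduce in a minimal simulation. The sketch below uses a LIF with a spike-triggered adaptation current, pure white noise, and a simple jump rule a → a + Δ at each spike (illustrative parameters and one common convention, not necessarily the exact model of Eq (2)):

```python
import numpy as np

def adaptive_lif_isis(mu=5.0, Delta=1.0, tau_a=0.5, D=0.1,
                      dt=2e-3, n_isi=4000, seed=2):
    """LIF with spike-triggered adaptation:
       v' = mu - v - a + sqrt(2D)*xi,  tau_a a' = -a,  a -> a + Delta at spike."""
    rng = np.random.default_rng(seed)
    v, a = 0.0, 0.0
    t, t_last, isis = 0.0, 0.0, []
    while len(isis) < n_isi:
        v += dt * (mu - v - a) + np.sqrt(2 * D * dt) * rng.standard_normal()
        a += -dt * a / tau_a
        t += dt
        if v >= 1.0:                 # spike: reset voltage, increment adaptation
            isis.append(t - t_last)
            t_last, v = t, 0.0
            a += Delta
    return np.array(isis)

isis = adaptive_lif_isis()
dT = isis - isis.mean()
rho1 = np.mean(dT[:-1] * dT[1:]) / np.mean(dT ** 2)
# rho1 < 0: a short interval raises the adaptation level and lengthens the next one
```

With the adaptation time constant comparable to the mean ISI, the estimated ρ1 comes out clearly negative, in line with the white-noise-dominated regime described above.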

We would like to emphasize that for the channel-noise case we have explicitly taken into account additional white noise (we used D = 0 for the previous cases). As becomes evident from Fig 7, this white noise can change the sign of the correlation coefficient and, more generally, the way the first SCC depends on the time constant of the channel kinetics. This illustrates how different channel fluctuations may interact to shape the serial correlation coefficient of the interspike interval.

Adaptive leaky integrate-and-fire model with network-noise-like fluctuations

So far we have discussed neuron models that are subject to positively correlated noise as it would arise due to synaptic filtering of uncorrelated pre-synaptic spike trains or due to slow adaptation channels. Such input noise processes exhibit power spectra with increased power at low frequencies (or, depending on the perspective, reduced power at high frequencies). This has clear applications to neurons in the sensory periphery that often lack synaptic input (spike variability is mainly caused by channel noise) or are only subject to approximately uncorrelated feedforward spike input. However, what about the interesting case of cortical neurons as part of a recurrent network of neurons? Although our theory requires a mean-driven cell and is therefore not generally applicable to this situation (many neurons in the cortex operate in an excitable, i.e. fluctuation-driven, regime), we discuss now a special case in which it nevertheless can be employed.

Most neural networks show a large heterogeneity with respect to cellular parameters as well as to the kind, number, and strength of synaptic connections, resulting in distributions of firing rates and CVs, with the latter measure of variability varying between 0.2 and 1.5 (see e.g. [58]). For cells that fire rather regularly and at high rates we can assume that they are effectively mean-driven (with respect to the sum of intrinsic and recurrent currents); although a low value of the CV is in principle also possible for an excitable neuron through the mechanism of coherence resonance [59], this requires a close proximity to the bifurcation point and a fine-tuning of the noise intensity that is unlikely to take place in the mentioned cells. The same argument can be made for certain areas of the brain, as for instance the motor cortex, where the firing variability is generally lower (see e.g. the review [60] or a more recent study on the variability in motor cortex [61]).

We consider such a mean-driven neuron that receives input from a recurrent network in an asynchronous irregular state (the neuron could be part of this network or be subject to a feedforward input from such a network). This kind of network noise can be approximated by a Gaussian process; its temporal correlation function can attain different shapes and depends on the detailed connectivity in the network (see e.g. [62, 63]). In typical cases of low to intermediate firing rate one encounters both in in vivo experiments [64] and in theoretical studies [45, 62, 65] fluctuations that are referred to as green noise, the power spectrum of which possesses reduced power at low frequencies and is otherwise flat. Such a power spectrum is well approximated by a sum of a white noise and an Ornstein-Uhlenbeck process (see e.g. Figure 14 in [45]) if the previously independent noise sources in Eq (2a) and (2d) are chosen to be anti-correlated, ξv = −ξη. The respective power spectrum of the random process is given by (11) S(ω) = 2D + (2σ²τη − 4σ√(Dτη))/(1 + ω²τη²). This power spectrum possesses a constant high-frequency limit limω→∞ S(ω) = 2D and reduced power at low frequencies, limω→0 S(ω) = 2(√D − σ√τη)², see Fig 8A. The crossover between those two limits occurs around ω = 1/τη. Furthermore, numerical inspection of the self-consistent network spectrum in [45] revealed that the parameter τη is mainly set by the mean firing rate r0 of the neurons in the recurrent network, or, equivalently, by their mean ISI 〈T〉net = 1/r0, and we have roughly τη ≈ 〈T〉net/π. The parameters D and σ2 are determined by the number, type, and strength of synaptic connections, see [45].
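This construction can be checked numerically: generate ζ(t) = √(2D) ξ(t) + η(t) with the Ornstein-Uhlenbeck process driven by the sign-flipped white noise, and verify with an averaged periodogram that power is depleted at low frequencies. The sketch below uses our own discretization and parameter choices (with √D = 2σ√τη the ratio of low- to high-frequency power is 1/4):

```python
import numpy as np

# assumed parameters; with sqrt(D) = 2*sigma*sqrt(tau_eta), S(0)/S(inf) = 1/4
tau_eta, sigma, dt = 1.0, 0.5, 0.01
D = 4 * sigma**2 * tau_eta
rng = np.random.default_rng(0)

n_seg, seg_len = 256, 4096
xi = rng.standard_normal((n_seg, seg_len))            # common white-noise source
eta = np.zeros((n_seg, seg_len))
eta[:, 0] = sigma * rng.standard_normal(n_seg)        # stationary initial condition
for i in range(1, seg_len):                           # OU process driven by -xi
    eta[:, i] = eta[:, i - 1] * (1 - dt / tau_eta) \
                - sigma * np.sqrt(2 * dt / tau_eta) * xi[:, i - 1]
zeta = np.sqrt(2 * D / dt) * xi + eta                 # green noise: white part + OU

# averaged periodogram; for a pure white sequence this estimator yields S = 2D
spec = np.mean(np.abs(np.fft.rfft(zeta, axis=1))**2, axis=0) * dt / seg_len
omega = 2 * np.pi * np.fft.rfftfreq(seg_len, d=dt)    # angular frequency axis

low = spec[1:4].mean()      # angular frequencies well below 1/tau_eta
high = spec[-200:].mean()   # flat high-frequency plateau, approximately 2D
```

The estimated spectrum is flat at high frequencies (close to 2D) and suppressed at low frequencies, as sketched in Fig 8A.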

thumbnail
Fig 8. Serial correlation coefficients for the LIF with network-noise-like fluctuations.

Panel A shows the input power spectrum Sin with reduced power at low frequencies of a random process that is applied to a LIF model. The resulting SCCs ρ1 for no adaptation (upper line) and weak adaptation (lower line) are displayed in panel B. The coefficient ρ1 has a minimum because for both long and short correlation times the noise becomes essentially white. The patterns of interval correlations are shown in C for no adaptation and in D for weak adaptation. Even in the absence of an adaptation current the network noise can generate negative ISI correlations. This is due to the lack of power at low frequencies. For an adaptive LIF model the negative correlations are enhanced. Parameters (C, D): γ = 1, μ = 5, τa = (0, 2), Δ = (0, 2), D = 0.18, τησ2 = 4.5 ⋅ 10−2 and resulting T* = (0.22, 0.67).

https://doi.org/10.1371/journal.pcbi.1009261.g008

It turns out that such a power spectrum can be treated by our theory if we adjust the specific correlation coefficient ρk,η as follows (12) This is exactly the approximation for the SCC that has been derived for purely colored-noise driven neurons in [47]. For our model with an adaptation variable, this generalization is valid if the corresponding autocorrelation function Cζζ(τ) decays exponentially (an additional delta-function is also permitted), which is the case for the considered green noise.

In Fig 8 we consider a mean-driven LIF neuron without (Fig 8C) or with adaptation (Fig 8D), which is subject to network-noise-like green fluctuations with spectrum Sζζ, see Fig 8A. This spectrum exhibits a low-frequency power suppression of Sζζ(ω = 0)/Sζζ(ω → ∞) = 0.25, similar to that of the self-consistent power spectrum in Ref. [45], Fig 14. We choose the overall amplitude of the noise such that the resulting CV lies between 0.2 and 0.5, which is in the lower physiological range for cortical cells. We compare the theoretical SCC (lines) to stochastic simulations for different values of the time scale τη and find in all cases a good agreement; this becomes an excellent agreement for weaker noise, as can be expected. Remarkably, even in the absence of adaptation, a green noise evokes negative ISI correlations, see Fig 8C; a corresponding observation has been made for another noise process with reduced low-frequency power in [47], and also the excitable (non-adapting) cells in the recurrent network of Ref. [45] exhibit a (somewhat smaller) negative correlation of ρ1 ≈ −0.1 [66]. With an additional adaptation current (Fig 8D), negative ISI correlations become even stronger, as can be expected. Furthermore, in both cases the SCCs depend non-monotonically on τη, see Fig 8B, because, somewhat non-intuitively, in both limits τη → 0 and τη → ∞ the effective noise becomes white (uncorrelated) and will not cause negative correlations anymore. We note that for the network situation with a mean-driven cell with mean ISI T* firing somewhat faster than the average cell with ISI 〈T〉net, a time constant τη ≳ T*/π, e.g. a ratio of τη/T* = 1, seems to be the most relevant value. Interestingly, this is close to the value that maximizes the strength of correlations, cf. Fig 8B.

Adaptive generalized integrate-and-fire model with both colored and white noise—Testing the range of validity

We turn to the most general case and discuss to what extent it can be expected that the SCC of a simulated stochastic spike train is well described by our perturbation theory. We choose a specific parameter set with respect to the deterministic system and the involved time scales and vary the two small parameters of our theory, i.e. the white noise intensity D and the variance of the colored noise σ2. In order to inspect the range of validity, we show not only the SCC ρ1 but also a popular measure of the output variability, the coefficient of variation (CV), CV = √〈(Ti − 〈Ti〉)²〉/〈Ti〉.
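For reference, the two statistics compared in this section can be estimated from a measured interval sequence in a few lines (a generic estimator sketch, not tied to any particular model):

```python
import numpy as np

def cv_and_scc(isis, max_lag=5):
    """Coefficient of variation and serial correlation coefficients rho_k
    of an interspike-interval sequence."""
    isis = np.asarray(isis, dtype=float)
    dT = isis - isis.mean()                 # deviations from the mean ISI
    var = np.mean(dT ** 2)
    cv = np.sqrt(var) / isis.mean()
    rho = [np.mean(dT[:-k] * dT[k:]) / var for k in range(1, max_lag + 1)]
    return cv, np.array(rho)

# perfectly alternating short/long intervals: CV = 1/3, rho_1 = -1, rho_2 = +1
cv, rho = cv_and_scc([1.0, 2.0] * 500, max_lag=2)
```

The alternating test sequence makes the expected values exact, which is a convenient sanity check before applying the estimator to simulated spike trains.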

In a first simulation setup the variance σ2 is fixed so that for small values of D the SCC ρ1 is predominantly determined by the colored noise (cf. ρ1 > 0 in Fig 9A bottom for small D). Increasing the white noise intensity has a twofold effect. First, it boosts the output variability of the spike train (cf. growth of the CV in Fig 9A top). Secondly, with stronger white noise the adaptation becomes the dominant process in shaping the SCC (ρ1 < 0) as demonstrated in Fig 9A bottom for large D. Put differently, the sign of the SCC can be determined by the ratio of the noise intensities D/(τησ2), which can be seen from Eq (6), the only place where the noise intensities enter in our theory.

thumbnail
Fig 9. Range of validity and effect of varying noise strengths.

Coefficient of variation CV (top) and SCC of adjacent intervals ρ1 (bottom) for the GIF model with weak adaptation. We test to what extent simulation results are well described by our weak-noise theory with respect to the two small parameters (D, σ2). In A the variance of the colored noise is fixed (σ2 = 0.1) and the white noise intensity D is varied; in B we fix D = 0.01 and vary σ2. We find quantitative agreement for CV < 0.3 in A and CV < 0.15 in B. In both cases qualitative agreement in terms of the shape of the SCCs (see insets) is found over the whole range of D and σ2, respectively. Parameters: γ = 1, μ = 20, βw = 1.5, τw = 1.5, τa = 10, Δ = 10, τη = 1 and resulting T* = 0.57.

https://doi.org/10.1371/journal.pcbi.1009261.g009

We find quantitative agreement between theory and simulation for both the CV and SCC ρ1 up to CV = 0.3. Qualitative agreement is maintained over the whole tested range of noise intensities D; even for a relatively high CV, e.g. CV ≈ 0.7, the dependence of the SCC on the lag k is well described by our theory (see inset Fig 9A).

In a second setup we fix the white noise intensity D and test how varying the colored-noise variance affects the agreement between simulation and theory.

Here we find quantitative confirmation of our theory up to CV = 0.15. Again, theory and simulations agree qualitatively over the whole range tested, as demonstrated in the inset of Fig 9B. Specifically, focusing on the SCC’s dependence on the lag k, the theory reproduces the change in sign between ρ1 and ρ2, the minimum at k = 3, and the subsequent decay.

Traub-Miles model with an M current and both colored and white noise

Finally, we demonstrate that our theory can be applied beyond the integrate-and-fire framework to a Hodgkin-Huxley-like conductance-based neuron. Specifically, we use the Traub-Miles model endowed with a slow adaptation-like potassium current as considered by Ermentrout [67] and drive it with both colored and white noise (notation as above)
(13a) C V̇ = I − Iion − IM + √(2D) ξv(t) + η(t),
(13b) τη η̇ = −η + √(2τη) σ ξη(t).
Here I is a constant current and Iion comprises the fast sodium, leak and potassium currents, given in terms of the reversal potentials Ey, the maximum conductances gy and the gating variables h, n and m:
(14a) Iion = gNa m³h (V − ENa) + gK n⁴ (V − EK) + gL (V − EL),
(14b) ẋ = ax(V)(1 − x) − bx(V) x, x ∈ {m, h, n}.
Spike-frequency adaptation is mediated by a slow potassium current
(15a) IM = gM z (V − EK),
(15b) τz ż = z∞(V) − z.
For ax(V), bx(V), z∞(V) and the parameter values, see Table 1.
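A deterministic reference implementation can be sketched as follows. We use standard Traub-Miles rate functions and parameter values as they appear in Ermentrout-style formulations (gNa = 100, gK = 80, gL = 0.1 mS/cm², ENa = 50, EK = −100, EL = −67 mV, C = 1 μF/cm²) together with an assumed M current (gM = 1 mS/cm², τz = 100 ms, z∞(V) = 1/(1 + exp(−(V + 20)/5))); the values in the paper's Table 1 may differ, so this is an illustrative sketch rather than the exact model:

```python
import math

# standard Traub-Miles rate functions (V in mV, rates in 1/ms); assumed values
def a_m(V): return 0.32 * (V + 54) / (1 - math.exp(-(V + 54) / 4))
def b_m(V): return 0.28 * (V + 27) / (math.exp((V + 27) / 5) - 1)
def a_h(V): return 0.128 * math.exp(-(V + 50) / 18)
def b_h(V): return 4 / (1 + math.exp(-(V + 27) / 5))
def a_n(V): return 0.032 * (V + 52) / (1 - math.exp(-(V + 52) / 5))
def b_n(V): return 0.5 * math.exp(-(V + 57) / 40)
def z_inf(V): return 1 / (1 + math.exp(-(V + 20) / 5))

def traub_miles_m(I=5.0, T=500.0, dt=0.01, g_M=1.0, tau_z=100.0):
    """Euler integration of the deterministic Traub-Miles model with an
    M-type adaptation current; returns spike times (upward crossings of 0 mV)."""
    gNa, gK, gL = 100.0, 80.0, 0.1
    ENa, EK, EL = 50.0, -100.0, -67.0
    V, z = -67.0, 0.0
    m = a_m(V) / (a_m(V) + b_m(V))      # gating variables at rest
    h = a_h(V) / (a_h(V) + b_h(V))
    n = a_n(V) / (a_n(V) + b_n(V))
    spikes, t = [], 0.0
    while t < T:
        I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
        I_M = g_M * z * (V - EK)
        V_prev = V
        V += dt * (I - I_ion - I_M)
        m += dt * (a_m(V_prev) * (1 - m) - b_m(V_prev) * m)
        h += dt * (a_h(V_prev) * (1 - h) - b_h(V_prev) * h)
        n += dt * (a_n(V_prev) * (1 - n) - b_n(V_prev) * n)
        z += dt * (z_inf(V_prev) - z) / tau_z
        t += dt
        if V_prev < 0.0 <= V:
            spikes.append(t)
    return spikes

spikes = traub_miles_m()
isis = [b - a for a, b in zip(spikes, spikes[1:])]
# spike-frequency adaptation: intervals lengthen as z builds up
```

Even this crude Euler scheme reproduces the qualitative transient of Fig 10A, the lengthening of successive interspike intervals as the adaptation variable z approaches its stationary value.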

thumbnail
Table 1. Simulation parameters for the Traub-Miles model.

https://doi.org/10.1371/journal.pcbi.1009261.t001

The transient behavior of V(t) and z(t) in response to a current step, see Fig 10A, displays spike-frequency adaptation (note the increase in the interspike intervals in the course of time). The noisy trajectory of the full system is close to the deterministic limit cycle for a sustained constant input, Fig 10B. Using the deterministic model (D = 0, σ = 0) and small short pulses, we can determine the PRC of the system according to Eq (16) numerically, see Fig 11A.

thumbnail
Fig 10. Traub-Miles model with slow M current.

Panel A shows the membrane potential V(t) and the adaptation’s gating variable z(t). At t = 25 ms a constant current I = 5 μA/cm2 (red line) is applied so that the model undergoes a transition from the excitable to the tonically firing regime. Due to the slow build-up of the adaptation current the model shows a transient behavior during which the firing rate decreases until z(t) has reached its stationary value (dotted line). The inset shows that z has two different phases, one during which z rapidly increases and another during which z slowly decays. Panel B shows the deterministic limit cycle (dashed line) together with a noisy trajectory of the tonically firing model with T* = 18.9 ms. Parameters are as given in Table 1.

https://doi.org/10.1371/journal.pcbi.1009261.g010

thumbnail
Fig 11. Serial correlation coefficients for the Traub-Miles model with slow M current.

Panel A shows the phase-response curve Z(τ) as well as the deterministic time course of the membrane potential V(t) from which the PRC was obtained. Note that the PRC is always positive; we therefore expect to find SCCs that are qualitatively similar to those obtained from a LIF model. Panel B shows a comparison between numerically obtained and theoretically calculated SCCs. As in Fig 3 we compare the specific and general SCCs ρk,a, ρk,η and ρk and find the same patterns as for a LIF model with a likewise purely positive PRC.

https://doi.org/10.1371/journal.pcbi.1009261.g011

In addition to the PRC, our theory requires the knowledge of the deterministic interspike interval as well as the peak value and the time scale of the adaptation current. The latter time scale is readily identified as the time constant of the gating variable z, τa = τz. The former parameters can be determined numerically from simulations of the deterministic model; we find T* = 18.9 ms and the peak value of the adaptation variable, which for the chosen parameters correspond to a slow and weak adaptation.

Consistent with our discussion surrounding Fig 3, we inspect the SCC in three different cases: i) with adaptation and white noise, ρk,a; ii) with colored and white noise but no adaptation, ρk,η; and iii) with both colored and white noise and adaptation present, ρk.

The resulting SCCs, see Fig 11B, show very good agreement between simulations and theory and illustrate that our theory is applicable to conductance-based models. A comparison with Fig 3A demonstrates that the type of the PRC is indeed the essential response characteristic that determines the pattern of interspike-interval correlations.

Discussion and conclusion

Neurons often show both stochasticity and slow time scales in their spiking process, features that are due to intrinsic and external noise sources [4, 13] and due to adaptation currents [35], respectively. The two essential characteristics of neural firing, randomness and adaptation, do not only become apparent in the spontaneous activity of nerve cells but also strongly influence their signal transmission properties (when stimulated with time-dependent signals) [67–78] and their synchronization with other cells (when considered in large recurrent networks) [79–82]. It is thus an important goal in neuroscience to theoretically understand spiking models of the nerve cell that incorporate these features.

Multidimensional stochastic integrate-and-fire models endowed with adaptation currents and intrinsic noise are simplified but biophysically grounded descriptions that incorporate both stochasticity and spike-frequency adaptation. It has been shown that they can mimic the response of pyramidal cells to complex stimuli to an astonishing degree of accuracy [5, 83–85]. Thus, not surprisingly, many theoretical efforts have been devoted to this model class, aiming to calculate the firing rate, the stationary voltage distribution, or the spike-train power spectrum. The most general theory of these statistics uses the framework of a multidimensional Fokker-Planck equation [45, 86, 87] that, however, admits in most cases only numerical solutions, which do not allow simple conclusions on how certain parameters shape the SCC.

A striking effect of both adaptation currents and realistic (i.e. temporally correlated) noise sources is that the interspike intervals will no longer be independent of each other. It is easy to understand that, for instance, a slow noise (resulting from low-pass filtered synaptic noise or from a channel population with slow kinetics) will correlate adjacent intervals. Adaptation combined with fast fluctuations leads, on the contrary, to negative correlations, i.e. a short ISI is on average followed by a longer one. Often, the spike generation is more complicated, as in a resonator model [39, 53]; the noise may have fast and slow components or its power may even be concentrated in a narrow frequency band [88]; slow noise and slow adaptation may result from the same source (stochastic adaptation ion channels [33]). In all these more complicated cases, we need a theory in order to predict and interpret the sign and, more generally, the patterns of interval correlations. Special cases have been attacked with different methods, assuming a slow noise [48, 89], a weak noise [38, 39, 47, 49, 88], weak adaptation [37, 90], or a discrete Markov-state description [91, 92]. The most important problem of ISI correlations evoked by a combination of both adaptation and colored noise has not been addressed yet.

Here we made an important step towards a general theory of interspike-interval correlations in stochastic neurons. We derived a general formula for the serial correlation coefficient of a multidimensional IF model subject to both a spike-triggered adaptation current and correlated noise. Two important assumptions of our theory are that i) the neuron is (if noise is switched off) in a tonically firing regime; ii) the stimulating noise is weak. We tested this formula in several situations corresponding to special cases of our general multidimensional integrate-and-fire model and we applied the theory even to a conductance-based model with adaptation. In all cases we found an excellent agreement with the theory, demonstrating that especially the second assumption is not a strong limitation. We will return to the limitations of our model below.

Our theory draws heavily on previous approaches that used the phase-response curve to study stochastic neurons [39, 47, 82]. Generalizing these methods, we arrived at a qualitatively novel result: the serial correlation coefficient as a function of the lag between the two intervals is not limited to a single geometric sequence but, if both spike-triggered adaptation and low-pass filtered noise are present, can be expressed by a sum of two geometric sequences, one corresponding to the adaptation part and one to the colored noise. Because of the nonlinear feedback nature of the adaptation and because the SCC is the ratio of two statistics (covariance and variance of intervals), this is a highly nontrivial but very useful finding. The structure of this new solution allows us to explain serial correlations that have been observed experimentally but so far could not be explained theoretically. Note that this does not cover the case of a temporally structured input noise. For instance, if a neuron is driven by a narrow-band noise with power in a preferred frequency band, or a power-law noise with power distributed over a broad frequency band, the SCC adopts some of this input’s temporal structure and can be more complicated [47, 88].

Correlations in the interval sequence of a spiking neuron are certainly interesting on their own: we see a complex biophysical system far from thermodynamic equilibrium that generates pulse sequences with rich statistics, very different in nature from the textbook example of a Poisson process or other renewal processes that would not show any interval correlations at all. The nonlinear model with colored noise and feedback poses an important problem for theoreticians in the field of computational neuroscience. Still, one may wonder why we should invest so much effort in the (approximate) calculation of this particular second-order statistic. However, besides the basic understanding of spontaneous neural firing, there are at least three more reasons to explore ISI correlations, outlined below in more detail.

First of all, having an analytical expression for the ISI correlations in terms of biophysical parameters may allow us to extract some of these parameters from measured correlation coefficients. That it is in principle possible to extract otherwise inaccessible parameters from experimentally measured ISI correlation coefficients has been demonstrated in Ref. [88] for electrosensory cells subject to narrowband noise. For the more involved case of an adapting neuron, it might be advisable to use not only ISI correlations but also other statistics (spike-train power spectra, for instance) to extract model parameters. Since the correlation coefficient depends nontrivially also on simple stimulation parameters, injecting a (temporally constant) current, varying its amplitude, and comparing the resulting variation in the SCC with the theoretical formula might provide another way to access parameters.

Secondly, interval correlations in the spontaneous activity can have a strong impact on the transmission of signals. This is clearly seen in the power spectrum of the spike train, the low-frequency limit of which is directly related to the sum over the ISI correlation coefficients, S(0) = r0 CV²(1 + 2∑k≥1 ρk) [93]. This spectrum of the spontaneous activity plays the role of a background spectrum if a stimulus is present. Having a reduced background spectrum at low frequencies (due to negative ISI correlations) may enhance the signal-to-noise ratio in this frequency band (but also diminish it in other bands) [24, 25]; positive correlations, on the contrary, may diminish the mutual information about a broadband stimulus [94]. A similar argument for the beneficial effect of negative ISI correlations can be made for a signal detection task [15] by means of the asymptotic Fano factor F(T) = 〈(N(T) − 〈N(T)〉)²〉/〈N(T)〉 (where N(T) is the spike count in the time window [0, T]), which is simply related to the low-frequency limit of the power spectrum stated above by limT→∞ F(T) = S(0)/r0. For the transmission of time-dependent signals, interval correlations are a way to control the noise spectrum and thus to shape the information transmission in a frequency-dependent manner, contributing to what is known as information filtering [26]. In all these cases, our theory describes how exactly the cellular dynamics (PRC), the adaptation (strength and time scale) and the noise (strength and correlation time) affect the serial correlations and through them the signal transmission properties of the respective neuron.
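The suppression of the long-window Fano factor by negative ISI correlations can be illustrated with a toy interval sequence (not one of the neuron models above): intervals constructed as Ti = T0 + xi − xi−1 with i.i.d. xi have ρ1 = −1/2 and ρk = 0 for k ≥ 2, so 1 + 2∑ρk = 0 and the asymptotic Fano factor vanishes, whereas the shuffled (renewal) sequence keeps F(T) ≈ CV²:

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_isi, T_win = 2000, 400, 200.0

counts_corr, counts_ren = [], []
for _ in range(n_trials):
    x = 0.15 * rng.standard_normal(n_isi + 1)
    isis = 1.0 + x[1:] - x[:-1]      # rho_1 = -1/2, so 1 + 2*sum(rho_k) = 0
    counts_corr.append(np.searchsorted(np.cumsum(isis), T_win))
    # shuffling the same intervals destroys the correlations (renewal control)
    counts_ren.append(np.searchsorted(np.cumsum(rng.permutation(isis)), T_win))

def fano(counts):
    c = np.asarray(counts, dtype=float)
    return c.var() / c.mean()

F_corr, F_ren = fano(counts_corr), fano(counts_ren)
# F_corr << F_ren, with F_ren close to CV^2 = 2 * 0.15^2 = 0.045
```

The correlated sequence has a bounded count variance (spike times never drift far from a regular grid), so its Fano factor in the window T = 200 is orders of magnitude below the renewal value.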

Thirdly, we may consider ISI correlations for neurons connected in the large recurrent networks of the brain. In networks, correlations can be evoked by the slow or oscillatory (narrow-band) noise emerging from the nonlinear interactions of neurons [27, 30–32, 95], by deviations of presynaptic spike statistics from Poisson statistics [47, 62], but also by synaptic short-term plasticity [47]. As outlined above for the case of a single noisy neuron, such ISI correlations can likewise affect the signal transmission properties of whole populations [27]. Moreover, the specific strength and pattern of ISI correlations may be informative about properties of the network (again in combination with other statistics). Although our theory cannot be readily applied to cortical neurons (many cells are excitable and the driving fluctuations are strong), it can be used in certain special situations, as we have demonstrated here.

Turning back to the theoretical challenges and achievements of our paper, we would like to finally discuss its limitations and possible directions of future research that go beyond the results achieved here. Although our analytical approach applies to a broad class of models and is not limited with respect to the time scales of adaptation or colored noise, it is restricted to i) neurons in the tonically firing regime and ii) weak input noise. We had to exclude cases in which neurons fire spike trains with complex patterns already in the deterministic case [96]. More restrictive from our point of view is that we had to exclude excitable neurons and neurons that are subject to strong stochastic input, such as emerges from synaptic background noise in recurrent neural networks. Of course, these cases are of particular importance for the stochastic dynamics of cortical cells and they can be addressed in the Fokker-Planck framework, as has been demonstrated for important spike-train statistics in full generality in [45]. Cases in which strong noise impinges on a mean-driven neuron may also be addressed by taking into account amplitude dynamics [97, 98] and higher-order phase-response functions.

However, the PRC techniques used in our work might turn out to be applicable also in these cases, based on generalizations of the notion of phase to the situation of strongly stochastic and excitable neurons. Indeed, two attempts have been made over the last couple of years to generalize the phase concept for deterministic systems to stochastic oscillators: the phase can be either defined in terms of the mean-first-passage time [99–101] (see in particular the recent analytical approach to the problem in Ref. [102]) or in terms of the asymptotic evolution of the probability density [103] (see specifically the analytically tractable case of a multidimensional Ornstein-Uhlenbeck process addressed in Ref. [104]). It is as yet neither clear which of the two phases is more appropriate for a given system [105, 106] nor how to generalize the concept of a phase-response curve to this case. The success of the PRC method in the limit of weakly perturbed deterministic systems, demonstrated here and in previous studies [39, 47, 82], should be a strong incentive to pursue this line of research.

Methods

Phase response curve

The phase-response curve (PRC) provides a method to calculate the phase shift of a nonlinear oscillator in (linear) response to a perturbation applied at specific phases of the limit cycle. The phase of the unperturbed system is defined with respect to some event recurring at a specific phase. An interesting subclass of nonlinear oscillators to which this theory can be applied consists of tonically firing integrate-and-fire (IF) neurons subject to weak noise, as considered here. The phase τ of the oscillator is then defined with respect to the spiking event that, in the deterministic case, occurs with period T* at times ti−1, ti, ti+1 and so on. This period can be normalized to 1 or 2π depending on the context. Here, however, we interpret τ ∈ [0, T*] as the relative time since the last spike and abstain from normalizing the phase.

In order to calculate the PRC, consider an IF model that is subject to a weak perturbation that instantaneously shifts the voltage variable v by some small ϵ at time τ. This can be realized by applying a delta kick to the dynamics of the neuron model (with a reference spike at time ti−1). The PRC Z(τ) measures the shift of the subsequent spike time ti(τ, ϵ) = ti−1 + Ti(τ, ϵ) or, equivalently, the deviation of the corresponding ISI, δTi(τ, ϵ) = Ti(τ, ϵ) − T*, due to that delta kick applied at “phase” τ: (16) Z(τ) = −limϵ→0 δTi(τ, ϵ)/ϵ. The sign on the right-hand side is chosen so that a positive PRC is obtained if a positive kick to the voltage variable (ϵ > 0) leads to a shortened ISI (δT < 0).
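For the LIF model v̇ = μ − v with threshold 1 and reset 0, this kick prescription can be carried out in closed form and checked against the well-known exponential PRC, Z(τ) = e^τ/μ. The sketch below uses the exact time-to-threshold map of the LIF rather than numerical integration (μ = 5 is an arbitrary illustrative choice):

```python
import math

mu = 5.0
T_star = math.log(mu / (mu - 1.0))        # deterministic period of the LIF

def v0(tau):
    """Unperturbed LIF trajectory from reset: v' = mu - v, v(0) = 0."""
    return mu * (1.0 - math.exp(-tau))

def time_to_threshold(v):
    """Remaining time to reach the threshold v_T = 1 from voltage v."""
    return math.log((mu - v) / (mu - 1.0))

def prc_kick(tau, eps=1e-4):
    """PRC from a small voltage kick eps applied at phase tau:
       Z(tau) = -(T_kicked - T*) / eps, as in the delta-kick definition."""
    T_kicked = tau + time_to_threshold(v0(tau) + eps)
    return -(T_kicked - T_star) / eps

# compare the kick-based estimate against the analytic PRC Z(tau) = exp(tau)/mu
taus = [0.1 * T_star * i for i in range(10)]
errs = [abs(prc_kick(t) - math.exp(t) / mu) for t in taus]
```

For small ϵ the finite-kick estimate deviates from the analytic PRC only at order ϵ, which is why the errors here are tiny.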

Adjoint method.

A useful approach to calculate the PRC analytically is provided by the adjoint method. The PRC satisfies the adjoint equation [40, 42] (17) where Z(t) is a set of N + 2 functions (18) Each component quantifies the linear response of the spike time to a perturbation of the limit cycle in the corresponding variable of the model. In particular, this implies that we can also calculate the phase-response with respect to perturbations of the auxiliary or adaptation variables. The matrix A(t) is the Jacobian of the considered IF model evaluated at the T*-periodic limit cycle solution X0(t) = [v0(t), w0(t), a0(t)]T (19) with the normalization (20) and the end (instead of initial) conditions (see [39]) (21) which, together with Eq (20), implies the condition (22) Solving the above differential equation for Z(t) with these conditions either analytically (in simple cases) or, for the general case, numerically, provides the PRC with respect to perturbations in the voltage, which is the first component of the vector Z(t).
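As a complement to the adjoint method, the PRC can also be estimated by the direct method: apply a small kick at phase τ and measure the resulting spike-time shift according to Eq (16). The sketch below does this for a leaky integrate-and-fire neuron with illustrative parameters (membrane time constant 1, threshold vT = 1, reset vR = 0, constant drive μ > 1; these values are not taken from the paper). Because the LIF equation is solvable in closed form, the perturbed spike time is computed exactly and the finite-difference estimate can be checked against the known LIF PRC, Z(τ) = e^(τ−T*)/(μ − 1), which also solves the adjoint equation for this model.

```python
import math

mu = 1.5                               # suprathreshold drive (illustrative value)
T_star = math.log(mu / (mu - 1.0))     # deterministic period of the LIF

def perturbed_isi(tau, eps):
    """Spike time after a voltage kick of size eps applied at phase tau.

    Uses the exact solution v(t) = mu*(1 - exp(-t)) between reset and kick.
    """
    v_tau = mu * (1.0 - math.exp(-tau))                 # voltage just before the kick
    t_rem = math.log((mu - v_tau - eps) / (mu - 1.0))   # remaining time to threshold
    return tau + t_rem

def prc_direct(tau, eps=1e-6):
    """Finite-difference PRC as in Eq (16): Z(tau) = -(T(tau, eps) - T*) / eps."""
    return -(perturbed_isi(tau, eps) - T_star) / eps

def prc_exact(tau):
    """Closed-form LIF PRC (solution of the adjoint equation for this model)."""
    return math.exp(tau - T_star) / (mu - 1.0)

for tau in [0.1 * T_star, 0.5 * T_star, 0.9 * T_star]:
    print(tau, prc_direct(tau), prc_exact(tau))
```

The PRC is positive and grows toward the end of the interval, the familiar type-I shape: kicks arriving late in the cycle advance the next spike most strongly.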

Assumptions for the analytical approximations

We require that the deterministic neuron model exhibits a unique limit-cycle solution with a finite period T*, i.e. the neuron is tonically firing in the absence of noise. Furthermore, following [39], we assume that the effect of the noise sources is weak such that the deviation of the ISI from the deterministic interval, δTi = Ti − T*, is small and can be described by the PRC (see above). First of all, this weak-noise condition allows us to approximate the mean ISI by the deterministic one, 〈Ti〉 ≈ T*. More rigorous approaches to the mean ISI (or, equivalently, to the mean first-passage time of the voltage going from reset to threshold values) have been pursued for white [107, 108] and colored [29, 48, 109, 110] noise, but are not considered here.

Although in the weak-noise limit the mean deviation of the ISI vanishes, correlations between individual ISI deviations do not, and the SCC can be approximated by (23) This is indeed only an approximation due to the assumption that the deviation from the mean is equal to the deviation from the deterministic interval, Ti − 〈Ti〉 ≈ Ti − T* = δTi, which is of course implied by the aforementioned assumption that the noise does not change the mean ISI.
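In practice, the SCC of Eq (23) is estimated from a measured ISI sequence using the empirical mean. A minimal estimator is sketched below; the surrogate data (ISI deviations generated by a first-order autoregressive recursion, for which ρk decays geometrically) are purely illustrative and not part of the paper's model.

```python
import numpy as np

def scc(isis, k):
    """Serial correlation coefficient rho_k = Cov(T_i, T_{i+k}) / Var(T_i), k >= 1."""
    dT = np.asarray(isis) - np.mean(isis)
    return np.mean(dT[:-k] * dT[k:]) / np.mean(dT * dT)

# Illustrative surrogate data: ISI deviations with geometric correlations,
# delta T_{i+1} = a * delta T_i + noise, for which rho_k is approximately a**k.
rng = np.random.default_rng(1)
a, n = 0.5, 100_000
xi = rng.standard_normal(n)
dT = np.empty(n)
dT[0] = 0.0
for i in range(n - 1):
    dT[i + 1] = a * dT[i] + xi[i]
isis = 1.0 + 0.01 * dT        # mean interval 1.0, weak fluctuations

print(scc(isis, 1), scc(isis, 2))   # close to 0.5 and 0.25
```

Note that the weak fluctuations make the distinction between the empirical mean and T* irrelevant, which is exactly the approximation used in Eq (23).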

In the succeeding sections we calculate the SCC for a general nonlinear stochastic IF model with spike-triggered adaptation and driven by a combination of white and colored noise. It simplifies the analytical treatment to first address the special case of a neuron without adaptation but with white and colored noise, which is what we do in the next subsection.

Correlation coefficient for IF models without adaptation

We first demonstrate how the SCC for a non-adaptive stochastic IF model driven by correlated and white noise can be calculated, i.e. in a first step we explicitly exclude the adaptation which can be easily achieved by setting Δ = 0.

Deviations from the mean ISI can be calculated via the PRC Z(τ), which quantifies how a small displacement of the voltage variable v due to a perturbation ui(τ) = ϵδ(τ − τ0), applied at a specific “phase” τ0 ∈ [0, T*] after the last spike time ti−1, advances (Z(τ0)ui(τ0) > 0) or delays (Z(τ0)ui(τ0) < 0) the next spike time ti. In general, a perturbation will affect the ISI deviation over the entire time window (see Fig 1) according to (24) The PRC Z(τ) of the deterministic system (D = σ = 0) is obtained by the adjoint method or numerically, as described above, while the perturbation follows immediately from Eq (2) (25) Hence, the deterministic limit cycle is perturbed by two independent processes: a weak OU process η and Gaussian white noise ξv. The deviation of the i-th interval is then given by (26) from which we define two random numbers given by the integrals of the weighted noises acting over the i-th interval (27) (28) Taking products of deviations of intervals that are lagged by an integer k permits the calculation of the interval covariance needed in Eq (23) for the SCC: (29) (30) (31) where we approximated ti+k − ti ≈ kT* and used the correlation functions of the noise sources, e.g. Cη(Δt) = 〈η(ti)η(ti + Δt)〉. Note that mixed terms do not contribute since η and ξv are independent processes. Because of the presence of δ-correlated white noise, it is convenient to distinguish between the calculation of the covariance between distinct intervals (k ≥ 1, where the white-noise correlation function does not contribute) and the calculation of the variance (k = 0): (32) where β = exp(−T*/τη). Intuitively, this distinction implies that the white noise source increases the variance of the ISIs but does not introduce correlations among the intervals (at least, in the absence of adaptation). We note that the variance of the interspike interval in the case of purely white noise (Eq 32 with k = 0 and σ2 = 0) reduces to , which agrees with an expression derived by Ermentrout et al. [111] (see the unnumbered equation below their Eq 10).

From Eqs (32) and (23) the SCC can be calculated in terms of the correlation functions, yielding for a general PRC an expression in terms of double integrals that have to be evaluated numerically. However, following [47] we choose to express the SCC in the Fourier domain, using the Wiener-Khinchin theorem to substitute the correlation functions, and the finite Fourier transform of the PRC . Eq (32) then becomes (33) The SCC reads (as stated in the main part) (34) For D = 0 this expression agrees with the one derived by Schwalger et al. [47]. Remarkably, Eq (34) is the only place where the intensity of the white noise and the variance of the colored noise appear in our theory. Note that the limit D → 0, σ2 → 0 is well defined for a fixed ratio of noise intensities, D/(τησ2).
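The qualitative content of Eq (34) in the pure colored-noise case can be probed by direct simulation. The sketch below integrates a leaky IF neuron driven by a weak OU process with the Euler–Maruyama scheme (all parameter values are illustrative, not taken from the paper); for a noise correlation time τη larger than the deterministic period, the lag-one SCC comes out positive, consistent with the picture that slowly varying input shortens or lengthens several successive intervals together.

```python
import numpy as np

def simulate_lif_ou(mu=1.5, tau_eta=2.0, sigma=0.1, dt=2e-3,
                    n_steps=1_000_000, seed=2):
    """Euler-Maruyama integration of a LIF neuron driven by an OU process.

    dv/dt = mu - v + eta,  spike and reset at v = 1 -> v = 0
    deta  = -eta/tau_eta dt + sqrt(2 sigma^2 / tau_eta) dW
    Returns the sequence of interspike intervals.
    """
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_steps)
    noise_amp = np.sqrt(2.0 * sigma**2 / tau_eta * dt)
    v, eta, t_last = 0.0, 0.0, 0.0
    isis = []
    for i in range(n_steps):
        v += dt * (mu - v + eta)
        eta += -dt * eta / tau_eta + noise_amp * xi[i]
        if v >= 1.0:                      # threshold crossing: spike and reset
            t = (i + 1) * dt
            isis.append(t - t_last)
            t_last = t
            v = 0.0
    return np.array(isis)

def scc(isis, k):
    dT = isis - isis.mean()
    return np.mean(dT[:-k] * dT[k:]) / np.mean(dT * dT)

isis = simulate_lif_ou()
print(len(isis), scc(isis, 1))   # positive lag-one SCC for slow colored noise
```

Adding a white-noise term to the voltage equation would increase the ISI variance without contributing to the covariance between distinct intervals, and would hence reduce the SCC, as the denominator of Eq (34) suggests.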

Derivation of the general correlation coefficient

To calculate the SCC for the full system we pursue a strategy similar to that of Ref. [39]. We find a relation between ISI deviations and perturbations and another relation between ISI deviations and adjacent peak adaptation values. These two relations allow us (i) to express ρk in terms of the covariance ck of the peak adaptation values and (ii) to find a stochastic map for the peak adaptation values from which this covariance can be calculated.

First, ISI deviations can again be calculated via the PRC Z(τ) as described above. To this end we separate the limit-cycle dynamics from the noise-induced perturbations. In this case the perturbation (35) acting over one ISI Ti is found by rewriting Eq (2) as (36) with the deterministic peak adaptation value and the deviation from it, δai−1 = a(ti−1) − a* [here and in the following, a(ti) is taken right after the increment]. Here, the deterministic limit cycle is perturbed not only by a weak OU process η and Gaussian white noise ξv but also by a small deviation in the peak adaptation value δai−1. Combining Eq (24) with (35) and shifting the index for notational convenience, we obtain (37) In contrast to the case considered in the previous section, interval correlations cannot be calculated from this equation directly because the correlations among peak adaptation values 〈δai δai+k〉 are unknown. Moreover, the peak adaptation value and, for example, the colored noise are not independent of each other, 〈δaiHi+1〉 ≠ 0.

Instead we derive a second expression for the interval deviations by relating adjacent peak adaptation values ai = a(ti), ai+1 = a(ti+1), which will generally deviate from the deterministic value, δai = ai − a* ≠ 0. Since the adaptation is spike-triggered, its time course over one ISI is deterministic. The only non-deterministic quantity relating the two adjacent peak adaptation values is the length of the corresponding ISI Ti+1 (38) Deviations δTi and δai, δai+1 from their deterministic values can be related by linearizing Eq (38) (39) i.e. two adjacent peak adaptation values give us knowledge about the deviation of the interval in between. Inserting Eq (39) into (23) we find an expression for the SCC in terms of the covariances ck = 〈δaiδai+k〉 of the peak adaptation values: (40) with α = exp(−T*/τa), which contains the adaptation time scale. Next we determine the covariance from a stochastic map that is derived by combining Eqs (37) and (39) (41) where the linear response to adaptation is described by (42) while the linear responses to colored and white noise, Hi and Ξi, are defined as in the preceding section [Eqs (27) and (28)]. Note that ν is constant, which reflects the fact that the adaptation dynamics includes neither noise nor subthreshold adaptation. In contrast, Hi and Ξi are random numbers. Furthermore, the stability of the stochastic map Eq (41) requires |αν| < 1.

Recursively applying the map to itself yields a relation not only between adjacent peak adaptation values, i.e. at lag 1, but at any lag k (43) The covariance is obtained by multiplying Eq (43) with δai and applying the average (44) Note that 〈δaiΞi+n−1〉 = 0 for n > 0 because Ξi only contains the uncorrelated white noise after the spike time ti to which the peak adaptation value δai belongs. The remaining terms of the sum can be simplified using the exponential decay of the averaged OU process (45) with β = exp(−T*/τη) and k ≥ 0. The resulting geometric sequence can be summed, yielding the covariance (46) The covariance so obtained for any lag k is determined by the variance , the covariance 〈δaiHi〉 (at lag k = 0), and two prefactors that exclusively carry the dependence on k. To determine the SCC from ck according to Eq (40), we use the stochastic map to calculate the ratio (it turns out that this ratio is all we have to know). We obtain a first equation by squaring the map and averaging (47) For further use below we note that in the stationary case . A second equation is obtained by multiplying the map with Hi+1 and averaging (48) Here we used Eq (45) and the fact that terms containing Ξi drop out for the reason discussed above. Combining Eqs (47) and (48) yields the desired relation (49) The remaining terms , and 〈HiHi+1〉 are the known correlation functions of white and colored noise and can be calculated as in the preceding section.

We can now calculate the SCC: combining Eq (40) with Eq (46), dividing both numerator and denominator by 〈δaiHi〉, and applying Eq (49), we find the result presented in the main part (50) with coefficients (51a) (51b) (51c) (51d) and specific correlation coefficients (52a) (52b) As is easily verified, the main result Eq (50) is consistent in different limiting cases. Switching off the colored noise (σ = 0, implying Sou(ω) ≡ 0) yields ρk = ρk,a. Similarly, setting Δ = 0 (vanishing adaptation) results in ν = 1 and ρk = ρk,η.
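The structure of Eq (46), a covariance built from two geometric sequences in αν and β, can be illustrated with a toy analogue of the stochastic map Eq (41): a linear recursion driven by an AR(1) input (standing in for the correlated numbers Hi) plus white noise (standing in for Ξi). All coefficients below are arbitrary illustrative choices, not values from the paper; the check solves for the two geometric amplitudes from c1 and c2 and successfully predicts c3.

```python
import numpy as np

rho, beta = 0.6, 0.8        # stand-ins for alpha*nu and exp(-T*/tau_eta)
n, burn = 200_000, 1_000
rng = np.random.default_rng(3)

# AR(1) input of unit variance (analogue of the correlated numbers H_i)
eta = rng.standard_normal(n) * np.sqrt(1.0 - beta**2)
h = np.empty(n); h[0] = 0.0
for i in range(n - 1):
    h[i + 1] = beta * h[i] + eta[i]

# linear stochastic map (toy analogue of Eq (41)); xi plays the role of Xi_i
xi = rng.standard_normal(n)
x = np.empty(n); x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = rho * x[i] + h[i] + xi[i]

def cov(x, k):
    """Stationary covariance at lag k, discarding the initial transient."""
    dx = x[burn:] - x[burn:].mean()
    return np.mean(dx[: len(dx) - k] * dx[k:]) if k else np.mean(dx * dx)

c1, c2, c3 = cov(x, 1), cov(x, 2), cov(x, 3)
# ansatz c_k = A rho^k + B beta^k: solve for A, B from c1, c2, predict c3
A, B = np.linalg.solve([[rho, beta], [rho**2, beta**2]], [c1, c2])
print(c3, A * rho**3 + B * beta**3)   # the two values nearly coincide
```

That the ansatz fits exactly follows from the recursion itself: the lag-(k+1) covariance obeys an inhomogeneous geometric recursion whose driving term decays as β^k, just as in the step from Eq (44) to Eq (46).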

Coefficient of variation

Finally, we derive the coefficient of variation for the full system, which follows a scheme similar to the derivation of the correlation coefficient presented above. Again, we use the assumption that the mean ISI can be approximated by the deterministic ISI. This allows us to express the CV in terms of the deviation δTi of the spike time and the deterministic period T* (53) Eq (39) can be used to relate the variance of the ISI to the variance and covariance of the adaptation variable (54) The latter variance and covariance can be traced back to the statistics of the uncorrelated and correlated Gaussian random numbers, Ξi and Hi, respectively. Specifically, we derive three relations from the stochastic map Eq (41) that are related to Eqs (47) and (46) at lag 1 and Eq (48) (55) (56) (57) Substituting these expressions into Eq (54) and dividing by T*2 yields the following expression: (58) We recall that Hi and Ξi are the integrated and PRC-weighted colored and white noise sources and that, correspondingly, the terms above are found to be (59) (60) in strict analogy to the considerations in section Correlation coefficient for IF models without adaptation.

We stress again that our theory is based on the linear response function of a phase model, namely the phase-response curve, a description that holds true only to leading order in the perturbation amplitude (e.g. σ in the case of purely colored noise). Therefore, in this linear framework, deviations in the phase and, in consequence, in the interspike interval caused by perturbations with amplitude σ can meaningfully be described only up to linear order in σ. This also implies that averages of products of intervals depend in leading order on σ2. Any extension into the realm of higher-order terms must be based on nonlinear response functions.
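This leading-order statement, that ISI deviations and hence the CV scale linearly with the noise amplitude, can be checked numerically. The sketch below simulates a white-noise-driven LIF neuron (illustrative parameters, not those of the paper) at two noise amplitudes differing by a factor of two and compares the resulting CVs.

```python
import numpy as np

def lif_cv(noise_amp, mu=1.5, dt=1e-3, n_steps=600_000, seed=4):
    """CV of a LIF neuron driven by white noise of given amplitude.

    Euler-Maruyama scheme: v += dt*(mu - v) + noise_amp*sqrt(dt)*xi
    """
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_steps)
    sq = noise_amp * np.sqrt(dt)
    v, t_last, isis = 0.0, 0.0, []
    for i in range(n_steps):
        v += dt * (mu - v) + sq * xi[i]
        if v >= 1.0:                  # spike and reset
            t = (i + 1) * dt
            isis.append(t - t_last)
            t_last = t
            v = 0.0
    isis = np.array(isis)
    return isis.std() / isis.mean()

cv1, cv2 = lif_cv(0.05), lif_cv(0.10)
print(cv1, cv2, cv2 / cv1)   # ratio close to 2 in the weak-noise regime
```

Deviations of the ratio from 2 indicate either finite sampling or the onset of higher-order (nonlinear-response) contributions at the larger amplitude.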

Details on the numerical integration of stochastic equations

For the numerical integration of Eq (2), a stochastic Euler scheme was used with step size Δt = 10−5. The simulation was terminated after 5 ⋅ 104 to 105 spikes; for all curves shown the standard error of the estimated mean of ρ1 was below 0.009.

To reduce transient effects, the initial conditions were chosen to be on the deterministic limit cycle, specifically v0 = vR, w0 = wR, a0 = a*, η0 = 0. The deterministic peak adaptation value a* can be calculated from the deterministic period via a* = (Δ/τa)/[1 − exp(−T*/τa)] or, if T* is analytically inaccessible, directly from integrating the noiseless system. Of course, the specific initial values of the noise variable and adaptation variable in the stochastic system impose a transient (non-stationary) behavior in any measured statistics. However, due to the large number of subsequent ISIs, the transients have little effect on the considered statistics. Indeed, for selected parameter sets, we have verified that the initial values do not matter for the measured SCC.
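The two routes to a* can be checked against each other numerically: integrate the noiseless adaptive model until the period converges and compare the measured peak adaptation value with the fixed-point expression. The sketch below uses an adaptive LIF neuron with exponential decay of a between spikes and a spike-triggered increment Δ/τa (illustrative parameters; with this jump-plus-decay convention the fixed point just after a spike reads a* = (Δ/τa)/[1 − exp(−T*/τa)], since the decay over one period must be balanced by the increment).

```python
import math

mu, tau_a, Delta = 2.0, 2.0, 0.3     # illustrative parameters
dt = 1e-4

# integrate the deterministic adaptive LIF until the period has converged
v, a = 0.0, 0.0
t, t_last = 0.0, 0.0
periods, peaks = [], []
while len(periods) < 60:
    v += dt * (mu - v - a)
    a += dt * (-a / tau_a)           # exponential decay between spikes
    t += dt
    if v >= 1.0:                     # spike: reset and increment adaptation
        v = 0.0
        a += Delta / tau_a
        periods.append(t - t_last)
        peaks.append(a)              # peak value right after the increment
        t_last = t

T_star, a_star = periods[-1], peaks[-1]
a_star_formula = (Delta / tau_a) / (1.0 - math.exp(-T_star / tau_a))
print(T_star, a_star, a_star_formula)   # measured and fixed-point a* agree
```

Initializing the stochastic simulation at (vR, a*, η = 0) as described above then places the system on (or very close to) this deterministic limit cycle.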

Acknowledgments

We thank Tilo Schwalger for useful discussions on the topic of this manuscript.

References

  1. 1. Faisal AA, Selen LP, Wolpert DM. Noise in the nervous system. Nat Rev Neurosci. 2008;9(4):292–303. pmid:18319728
  2. 2. Burkitt AN. A Review of the Integrate-and-fire Neuron Model: I. Homogeneous Synaptic Input. Biol Cyber. 2006;95:1.
  3. 3. Burkitt AN. A review of the integrate-and-fire neuron model: II. Inhomogeneous synaptic input and network properties. Biol Cyber. 2006;95:97. pmid:16821035
  4. 4. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal Dynamics From single neurons to networks and models of cognition. Cambridge: Cambridge University Press; 2014.
  5. 5. Badel L, Lefort S, Brette R, Petersen CCH, Gerstner W, Richardson MJE. Dynamic I-V Curves Are Reliable Predictors of Naturalistic Pyramidal-Neuron Voltage Traces. J Neurophysiol. 2008;99:656. pmid:18057107
  6. 6. Jolivet R, Kobayashi R, Rauch A, Naud R, Shinomoto S, Gerstner W. A benchmark test for a quantitative assessment of simple neuron models. J Neurosci Meth. 2008;169:417. pmid:18160135
  7. 7. Teeter C, Iyer R, Menon V, Gouwens N, Feng D, Berg J, et al. Generalized leaky integrate-and-fire models classify multiple neuron types. Nat Commun. 2018;9:709. pmid:29459723
  8. 8. Izhikevich EM. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Cambridge, London: The MIT Press; 2007.
  9. 9. Cox DR. Renewal Theory. London: Methuen; 1962.
  10. 10. Gerstner W. Time structure of the activity in neural network models. Phys Rev E. 1995;51:738. pmid:9962697
  11. 11. Perkel DH, Gerstein GL, Moore GP. Neuronal Spike Trains and Stochastic Point Processes. Biophys J. 1967;7:419.
  12. 12. Holden AV. Models of the Stochastic Activity of Neurones. Berlin: Springer-Verlag; 1976.
  13. 13. Tuckwell HC. Stochastic Processes in the Neuroscience. Philadelphia, Pennsylvania: SIAM; 1989.
  14. 14. Hagiwara S. Analysis of interval fluctuation of the sensory nerve impulse. Jpn J Physiol. 1954;14:234. pmid:13232885
  15. 15. Ratnam R, Nelson ME. Nonrenewal Statistics of Electrosensory Afferent Spike Trains: Implications for the Detection of Weak Sensory Signals. J Neurosci. 2000;20:6672. pmid:10964972
  16. 16. Chacron MJ, Longtin A, Maler L. Negative interspike interval correlations increase the neuronal capacity for encoding time-dependent stimuli. J Neurosci. 2001;21:5328. pmid:11438609
  17. 17. Neiman AB, Russell DF. Two Distinct Types of Noisy Oscillators in Electroreceptors of Paddlefish. J Neurophysiol. 2004;92:492. pmid:14573556
  18. 18. Nawrot MP, Boucsein C, Rodriguez-Molina V, Aertsen A, Grun S, Rotter S. Serial interval statistics of spontaneous activity in cortical neurons in vivo and in vitro. Neurocomp. 2007;70:1717.
  19. 19. Engel TA, Schimansky-Geier L, Herz AVM, Schreiber S, Erchova I. Subthreshold membrane-potential resonances shape spike-train patterns in the entorhinal cortex. J Neurophysiol. 2008;100:1576. pmid:18450582
  20. 20. Farkhooi F, Strube-Bloss MF, Nawrot MP. Serial correlation in neural spike trains: Experimental evidence, stochastic modeling, and single neuron variability. Phys Rev E. 2009;79:021905. pmid:19391776
  21. 21. Avila-Akerberg O, Chacron MJ. Nonrenewal spike train statistics: causes and consequences on neural coding. Exp Brain Res. 2011;210:353. pmid:21267548
  22. 22. Peterson AJ, Irvine D, Heil P. A model of synaptic vesicle-pool depletion and replenishment can account for the interspike interval distributions and nonrenewal properties of spontaneous spike trains of auditory-nerve fibers. J Neurosci. 2014;34(45):15097–15109. pmid:25378173
  23. 23. Song S, Lee JA, Kiselev I, Iyengar V, Trapani JG, Tania N. Mathematical modeling and analyses of interspike-intervals of spontaneous activity in afferent neurons of the zebrafish lateral line. Sci Rep. 2018;8(1):1–14. pmid:30291277
  24. 24. Chacron MJ, Lindner B, Longtin A. Noise shaping by interval correlations increases information transfer. Phys Rev Lett. 2004;93:059904. pmid:14995762
  25. 25. Lindner B, Chacron MJ, Longtin A. Integrate-and-fire neurons with threshold noise—A tractable model of how interspike interval correlations affect neuronal signal transmission. Phys Rev E. 2005;72:021911. pmid:16196608
  26. 26. Lindner B. Mechanisms of Information Filtering in Neural Systems. IEEE Trans Mol Biol Multi-Scale Commun. 2016;2:5.
  27. 27. Muscinelli P, Gerstner W, Schwalger T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput Biol. 2019;15(6):e1007122. pmid:31181063
  28. 28. Tetzlaff T, Rotter S, Stark E, Abeles M, Aertsen A, Diesmann M. Dependence of neuronal correlations on filter characteristics and marginal spike train statistics. Neural Comput. 2008;20(9):2133–2184. pmid:18439140
  29. 29. Moreno-Bote R, Parga N. Response of Integrate-and-Fire Neurons to Noisy Inputs Filtered by Synapses with Arbitrary Timescales: Firing Rate and Correlations. Neural Comput. 2010;22:1528. pmid:20100073
  30. 30. Litwin-Kumar A, Doiron B. Slow dynamics and high variability in balanced cortical networks with clustered connections. Nat Neurosci. 2012;15:1498. pmid:23001062
  31. 31. Ostojic S. Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons. Nat Neurosci. 2014;17:594. pmid:24561997
  32. 32. Wieland S, Bernardi D, Schwalger T, Lindner B. Slow fluctuations in recurrent networks of spiking neurons. Phys Rev E. 2015;92:040901(R). pmid:26565154
  33. 33. Schwalger T, Fisch K, Benda J, Lindner B. How noisy adaptation of neurons shapes interspike interval histograms and correlations. PLoS Comp Biol. 2010;6:e1001026. pmid:21187900
  34. 34. Fisch K, Schwalger T, Lindner B, Herz A, Benda J. Channel noise from both slow adaptation currents and fast currents is required to explain spike-response variability in a sensory neuron. J Neurosci. 2012;32:17332. pmid:23197724
  35. 35. Benda J, Herz AVM. A universal model for spike-frequency adaptation. Neural Comput. 2003;15:2523. pmid:14577853
  36. 36. Benda J. Neural adaptation. Curr Biol. 2021;31(3):R110–R116. pmid:33561404
  37. 37. Urdapilleta E. Onset of negative interspike interval correlations in adapting neurons. Phys Rev E. 2011;84:041904. pmid:22181172
  38. 38. Lindner B. Interspike interval statistics of neurons driven by colored noise. Phys Rev E. 2004;69:022901. pmid:14995506
  39. 39. Schwalger T, Lindner B. Patterns of interval correlations in neural oscillators with adaptation. Front Comp Neurosci. 2013;7:164. pmid:24348372
  40. 40. Brown E, Moehlis J, Holmes P. On the phase reduction and response dynamics of neural oscillator populations. Neural Comp. 2004;16:673. pmid:15025826
  41. 41. Gutkin B, Ermentrout GB, Reyes AD. Phase-response curves give the responses of neurons to transient inputs. J Neurophysiol. 2005;94(2):1623. pmid:15829595
  42. 42. Ermentrout GB, Terman DH. Mathematical foundations of neuroscience. vol. 35. Springer Science & Business Media; 2010.
  43. 43. Schiemann J, Schlaudraff F, Klose V, Bingmer M, Seino S, Magill PJ, et al. K-ATP channels in dopamine substantia nigra neurons control bursting and novelty-induced exploration. Nat Neurosci. 2012;15(9):1272–1280. pmid:22902720
  44. 44. Messer M, Costa KM, Roeper J, Schneider G. Multi-scale detection of rate changes in spike trains with weak dependencies. J Comp Neurosci. 2017;42(2):187–201. pmid:28025784
  45. 45. Vellmer S, Lindner B. Theory of spike-train power spectra for multidimensional integrate-and-fire neurons. Phys Rev Res. 2019;1(2):023024.
  46. 46. Ermentrout B. Type I membranes, phase resetting curves, and synchrony. Neural Comput. 1996;8:979. pmid:8697231
  47. 47. Schwalger T, Droste F, Lindner B. Statistical structure of neural spiking under non-Poissonian or other non-white stimulation. J Comput Neurosci. 2015;39:29. pmid:25936628
  48. 48. Middleton JW, Chacron MJ, Lindner B, Longtin A. Firing statistics of a neuron model driven by long-range correlated noise. Phys Rev E. 2003;68:021920. pmid:14525019
  49. 49. Shiau L, Schwalger T, Lindner B. Interspike interval correlation in a stochastic exponential integrate-and-fire model with subthreshold and spike-triggered adaptation. J Comput Neurosci. 2015;38:589. pmid:25894991
  50. 50. Lowen SB, Teich MC. Auditory-nerve action potentials form a nonrenewal point process over short as well as long time scales. J Acoust Soc Am. 1992;92:803. pmid:1324263
  51. 51. Chacron MJ, Lindner B, Maler L, Longtin A, Bastian J. Experimental and theoretical demonstration of noise shaping by interspike interval correlations. In: Stocks NG, Abbott D, Morse RP, editors. Fluctuations and Noise in Biological, Biophysical and Biomedical Systems III. vol. 5841. SPIE; 2005. p. 150.
  52. 52. Lewis C, Gebber G, Larsen P, Barman S. Long-term correlations in the spike trains of medullary sympathetic neurons. J Neurophysiol. 2001;85(4):1614–1622. pmid:11287485
  53. 53. Richardson MJE, Brunel N, Hakim V. From subthreshold to firing-rate resonance. J Neurophysiol. 2003;89:2538. pmid:12611957
  54. 54. Brunel N, Hakim V, Richardson MJE. Firing-rate resonance in a generalized integrate-and-fire neuron with subthreshold resonance. Phys Rev E. 2003;67:051916.
  55. 55. Liu YH, Wang XJ. Spike-frequency adaptation of a generalized leaky integrate-and-fire model neuron. J Comput Neurosci. 2001;10:25. pmid:11316338
  56. 56. Chacron MJ, Longtin A, St-Hilaire M, Maler L. Suprathreshold stochastic firing dynamics with memory in P-type electroreceptors. Phys Rev Lett. 2000;85:1576. pmid:10970558
  57. 57. Treves A. Mean-field analysis of neuronal spike dynamics. Network: Comput Neural Syst. 1993;4:259.
  58. 58. Softky WR, Koch C. The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs. J Neurosci. 1993;13:334. pmid:8423479
  59. 59. Pikovsky A, Kurths J. Coherence Resonance in a Noise-Driven Excitable System. Phys Rev Lett. 1997;78:775.
  60. 60. Mochizuki Y, Onaga T, Shimazaki H, Shimokawa T, Tsubo Y, Kimura R, et al. Similarity in Neuronal Firing Regimes across Mammalian Species. J Neurosci. 2016;36:5736. pmid:27225764
  61. 61. Rostami V, Rost T, Riehle A, van Albada SJ. Spiking neural network model of motor cortex with joint excitatory and inhibitory clusters reflects task uncertainty, reaction times, and variability dynamics. bioRxiv. 2020.
  62. 62. Dummer B, Wieland S, Lindner B. Self-consistent determination of the spike-train power spectrum in a neural network with sparse connectivity. Front Comp Neurosci. 2014;8:104.
  63. 63. Pena RF, Vellmer S, Bernardi D, Roque AC, Lindner B. Self-consistent scheme for spike-train power spectra in heterogeneous sparse networks. Front Comp Neurosci. 2018;12(9). pmid:29551968
  64. 64. Bair W, Koch C, Newsome W, Britten K. Power spectrum analysis of bursting cells in area MT in the behaving monkey. J Neurosci. 1994;14:2870. pmid:8182445
  65. 65. Lerchner A, Sterner GG, Hertz J, Ahmadi M. Mean field theory for a balanced hypercolumn model of orientation selectivity in primary visual cortex. Network: Comp Neural Sys. 2006;17:131. pmid:16818394
  66. 66. Vellmer S;. Personal communication.
  67. 67. Ermentrout B. Linearization of F-I curves by adaptation. Neural Comput. 1998;10:1721. pmid:9744894
  68. 68. Chacron MJ, Pakdaman K, Longtin A. Interspike Interval Correlations, Memory, Adaptation, and Refractoriness in a Leaky Integrate-and-Fire Model with Threshold Fatigue. Neural Comput. 2003;15:253. pmid:12590807
  69. 69. Benda J, Longtin A, Maler L. Spike-frequency adaptation separates transient communication signals from background oscillations. J Neurosci. 2005;25:2312. pmid:15745957
  70. 70. Benda J, Hennig RM. Spike-frequency adaptation generates intensity invariance in a primary auditory interneuron. J Comput Neurosci. 2008;24:113. pmid:17534706
  71. 71. Prescott SA, Sejnowski TJ. Spike-Rate Coding and Spike-Time Coding Are Affected Oppositely by Different Adaptation Mechanisms. J Neurosci. 2008;28:13649. pmid:19074038
  72. 72. Wimmer K, Hildebrandt KJ, Hennig RM, Obermayer K. Adaptation and Selective Information Transmission in the Cricket Auditory Neuron AN2. PLoS Comput Biol. 2008;4:e1000182. pmid:18818723
  73. 73. Peron S, Gabbiani F. Spike frequency adaptation mediates looming stimulus selectivity in a collision-detecting neuron. Nat Neurosci. 2009;12:318. pmid:19198607
  74. 74. Benda J, Maler L, Longtin A. Linear Versus Nonlinear Signal Transmission in Neuron Models With Adaptation Currents or Dynamic Thresholds. J Neurophysiol. 2010;104:2806. pmid:21045213
  75. 75. Farkhooi F, Muller E, Nawrot MP. Adaptation reduces variability of the neuronal population code. Phys Rev E. 2011;83:050905(R). pmid:21728481
  76. 76. Farkhooi F, Froese A, Muller E, Menzel R, Nawrot MP. Cellular Adaptation Facilitates Sparse and Reliable Coding in Sensory Pathways. PLoS Comput Biol. 2013;9:e1003251. pmid:24098101
  77. 77. Deemyad T, Kroeger J. Sub- and suprathreshold adaptation currents have opposite effects on frequency tuning. J Physiol. 2012;590:4839. pmid:22733663
  78. 78. Pozzorini C, Naud R, Mensi S, Gerstner W. Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci. 2013;16:942. pmid:23749146
  79. 79. Ermentrout B, Pascal M, Gutkin B. The effects of spike frequency adaptation and negative feedback on the synchronization of neural oscillators. Neural Comput. 2001;13:1285. pmid:11387047
  80. 80. Fuhrmann G, Markram H, Tsodyks M. Spike Frequency Adaptation and Neocortical Rhythms. J Neurophysiol. 2002;88:761. pmid:12163528
  81. 81. Ladenbauer J, Augustin M, Shiau LJ, Obermayer K. Impact of Adaptation Currents on Synchronization of Coupled Exponential Integrate-and-Fire Neurons. PLoS Comput Biol. 2012;8:e1002478. pmid:22511861
  82. 82. Zhou P, Burton SD, Urban N, Ermentrout GB. Impact of neuronal heterogeneity on correlated colored-noise-induced synchronization. Front Comput Neurosci. 2013;7:113. pmid:23970864
  83. 83. Jolivet R, Schürmann F, Berger TK, Naud R, Gerstner W, Roth A. The quantitative single-neuron modeling competition. Biol Cybern. 2008;99:417. pmid:19011928
  84. 84. Badel L, Lefort S, Berger TK, Petersen CCH, Gerstner W, Richardson MJE. Extracting non-linear integrate-and-fire models from experimental data using dynamic I-V curves. Biol Cybern. 2008;99:361. pmid:19011924
  85. 85. Harrison PM, Badel L, Wall MJ, Richardson MJE. Experimentally Verified Parameter Sets for Modelling Heterogeneous Neocortical Pyramidal-Cell Populations. PLoS Comput Biol. 2015;11:8. pmid:26291316
  86. 86. Ladenbauer J, Augustin M, Obermayer K. How adaptation currents change threshold, gain, and variability of neuronal spiking. J Neurophysiol. 2014;111:939. pmid:24174646
  87. 87. Augustin M, Ladenbauer J, Baumann F, Obermayer K. Low-dimensional spike rate models derived from networks of adaptive integrate-and-fire neurons: comparison and implementation. PLoS Comput Biol. 2017;13:e1005545. pmid:28644841
  88. 88. Bauermeister C, Schwalger T, Russell D, Neiman AB, Lindner B. Characteristic Effects of Stochastic Oscillatory Forcing on Neural Firing: Analytical Theory and Comparison to Paddlefish Electroreceptor Data. PLoS Comput Biol. 2013;9:e1003170. pmid:23966844
  89. 89. Schwalger T, Schimansky-Geier L. Interspike interval statistics of a leaky integrate-and-fire neuron driven by Gaussian noise with large correlation times. Phys Rev E. 2008;77:031914. pmid:18517429
  90. 90. Urdapilleta E. Noise-induced interspike interval correlations and spike train regularization in spike-triggered adapting neurons. Europhys Lett. 2016;115(6):68002.
  91. 91. Schwalger T, Lindner B. Theory for serial correlations of interevent intervals. Eur Phys J Spec Topics. 2010;187:211.
  92. 92. Schwalger T, Tiana-Alsina J, Torrent MC, Garcia-Ojalvo J, Lindner B. Interspike-interval correlations induced by two-state switching in an excitable system. Epl-Europhys Lett. 2012;99:10004.
  93. 93. Cox DR, Lewis PAW. The Statistical Analysis of Series of Events. London: Chapman and Hall; 1966.
  94. 94. Blankenburg S, Lindner B. The effect of positive interspike interval correlations on neuronal information transmission. Math Biosci Eng. 2016;13:461. pmid:27106183
  95. 95. Braun W, Longtin A. Interspike interval correlations in networks of inhibitory integrate-and-fire neurons. Phys Rev E. 2019;99(3):032402. pmid:30999498
  96. 96. Touboul J, Brette R. Dynamics and bifurcations of the adaptive exponential integrate-and-fire model. Biol Cybern. 2008;99:319. pmid:19011921
  97. 97. Wilson D, Ermentrout B. Greater accuracy and broadened applicability of phase reduction using isostable coordinates. J Math Biol. 2018;76(1-2):37–66. pmid:28547210
  98. 98. Wilson D. Isostable reduction of oscillators with piecewise smooth dynamics and complex Floquet multipliers. Phys Rev E. 2019;99(2):022210. pmid:30934292
  99. 99. Schwabedal JTC, Pikovsky A. Effective phase dynamics of noise-induced oscillations in excitable systems. Phys Rev E. 2010;81:046218. pmid:20481818
  100. 100. Schwabedal J, Pikovsky A. Effective phase description of noise-perturbed and noise-induced oscillations. Eur Phys J Spec Topics. 2010;187:63.
  101. 101. Schwabedal J, Pikovsky A. Phase Description of Stochastic Oscillations. Phys Rev Lett. 2013;110:204102. pmid:25167416
  102. 102. Cao A, Lindner B, Thomas PJ. A partial differential equation for the mean–first-return-time phase of planar stochastic oscillators. submitted. 2019.
  103. 103. Thomas PJ, Lindner B. Asymptotic Phase for Stochastic Oscillators. Phys Rev Lett. 2014;113:254101. pmid:25554883
  104. 104. Thomas P, Lindner B. Phase descriptions of a multidimensional Ornstein-Uhlenbeck process. Phys Rev E. 2019;99(6):062221. pmid:31330649
  105. 105. Pikovsky A. Comment on “Asymptotic Phase for Stochastic Oscillators”. Phys Rev Lett. 2015;115:069401. pmid:26296133
  106. 106. Thomas PJ, Lindner B. Comment on “Asymptotic Phase for Stochastic Oscillators” Reply. Phys Rev Lett. 2015;115:069402.
  107. 107. Arecchi FT, Politi A. Transient Fluctuations in the Decay of an Unstable State. Phys Rev Lett. 1980;45:1219.
  108. 108. Lindner B, Longtin A, Bulsara A. Analytic expressions for rate and CV of a type I neuron driven by white Gaussian noise. Neural Comp. 2003;15:1761. pmid:14511512
  109. 109. Brunel N, Sergi S. Firing frequency of leaky integrate-and-fire neurons with synaptic current dynamics. J Theor Biol. 1998;195:87. pmid:9802952
  110. 110. Galán RF. Analytical calculation of the frequency shift in phase oscillators driven by colored noise: Implications for electrical engineering and neuroscience. Phys Rev E. 2009;80(3):036113. pmid:19905186
  111. 111. Ermentrout B, Beverlin B, Troyer T, Netoff TI. The variance of phase-resetting curves. J Comput Neurosci. 2011;31(2):185–197. pmid:21207126