
Conceived and designed the experiments: TS KF JB BL. Analyzed the data: TS KF. Wrote the paper: TS KF JB BL. Performed mathematical analysis: TS BL. Simulations of integrate-and-fire model: TS. Simulations of conductance-based model: KF.

The authors have declared that no competing interests exist.

Channel noise is the dominant intrinsic noise source of neurons, causing variability in the timing of action potentials and interspike intervals (ISI). Slow adaptation currents are observed in many cells and strongly shape response properties of neurons. These currents are mediated by finite populations of ionic channels and may thus carry a substantial noise component. Here we study the effect of such adaptation noise on the ISI statistics of an integrate-and-fire model neuron by means of analytical techniques and extensive numerical simulations. We contrast this stochastic adaptation with the commonly studied case of a fast fluctuating current noise and a deterministic adaptation current (corresponding to an infinite population of adaptation channels). We derive analytical approximations for the ISI density and ISI serial correlation coefficient for both cases. For fast fluctuations and deterministic adaptation, the ISI density is well approximated by an inverse Gaussian (IG) and the ISI correlations are negative. In marked contrast, for stochastic adaptation, the density is more peaked and has a heavier tail than an IG density, and the serial correlations are positive. A numerical study of the mixed case, where both fast fluctuations and adaptation channel noise are present, reveals a smooth transition between the analytically tractable limiting cases. Our conclusions are furthermore supported by numerical simulations of a biophysically more realistic Hodgkin-Huxley type model. Our results could be used to infer the dominant source of noise in neurons from their ISI statistics.

Neurons of sensory systems encode signals from the environment by sequences of electrical pulses — so-called spikes. This coding of information is fundamentally limited by the presence of intrinsic neural noise. One major noise source is “channel noise” that is generated by the random activity of various types of ion channels in the cell membrane. Slow adaptation currents can also be a source of channel noise. Adaptation currents profoundly shape the signal transmission properties of a neuron by emphasizing fast changes in the stimulus but adapting the spiking frequency to slow stimulus components. Here, we mathematically analyze the effects of both slow adaptation channel noise and fast channel noise on the statistics of spike times in adapting neuron models. Surprisingly, the two noise sources result in qualitatively different distributions and correlations of time intervals between spikes. Our findings add a novel aspect to the function of adaptation currents and can also be used to experimentally distinguish adaptation noise and fast channel noise on the basis of spike sequences.

The firing of action potentials of a neuron

Spike-frequency adaptation is another common feature of neural dynamics that, however, is still poorly understood in the context of stochastic spike generation. Associated adaptation currents which act on time scales ranging from

In previous studies on stochastic models with adaptation, fluctuations were considered to be fast, e.g. Poissonian synaptic spike trains passing through fast synapses or a white Gaussian input current representing a mixture of intrinsic fluctuations and background synaptic input. In particular, the dominating intrinsic source of fluctuations is ion channel noise

In this paper, we analyze a perfect integrate-and-fire (PIF) model in which a population of

Our main concern in this paper was the effect of noise associated with the slow dynamics of adaptation on the interspike interval statistics. Specifically, the noise was regarded as the result of the stochastic activity of a finite number of slow “adaptation channels”, e.g. M-type channels (

In order to demonstrate the results on two different levels of complexity, we conducted both analytical investigations of a tractable integrate-and-fire model and a simulation study of a biophysically more realistic Hodgkin-Huxley-type neuron model. For the first model we chose the perfect (non-leaky) integrate-and-fire (PIF) model

The PIF model, however, describes only the dynamics of the subthreshold voltage. In particular, it does not explicitly yield the suprathreshold time course of the action potential, but only the time instant of its onset (given by the threshold crossings). Nevertheless, the spike-induced activation of adaptation channels can be introduced in the PIF model. To this end, we approximated the activation function

Although our model targets the stationary firing statistics, we stress that it exhibits spike-frequency adaptation in the presence of time-varying stimuli. In particular, spike-frequency adaptation in response to a step stimulus is retained regardless of the considered noise source, the channel numbers, or the approximations made during the theoretical analysis (

For the theoretical analysis the channel model describing the dynamics of each individual channel could be considerably simplified by a diffusion approximation. As shown in

Put differently, a finite population of slow adaptation channels (instead of an infinite population and hence a deterministic adaptation dynamics) entails the presence of an additional noise

To study the effect of the two different kinds of noise, we focused on two limit cases: in the limit of infinitely many channels, the adapting PIF model is driven only by white noise. In this case, Eq. (1) reads

We call this case

In the opposite limit, we considered only the stochasticity of the adaptation current but not the white noise. Setting

We call this case (and the corresponding model based on individual adaptation channels)
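The two limit cases can be illustrated with a minimal Euler-Maruyama simulation of an adapting perfect integrate-and-fire neuron. This is only a sketch: the parameter values, the spike-triggered increment `delta_a / tau_a`, and the noise intensities `D` (fast white noise) and `D_a` (adaptation noise) are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pif(mu=1.0, tau_a=100.0, delta_a=0.1, D=0.01, D_a=0.0,
                 dt=0.01, n_spikes=1000):
    """Euler-Maruyama sketch of an adapting perfect integrate-and-fire
    neuron (illustrative parameters):

        dv/dt = mu - a + sqrt(2 D)   * xi(t)     # fast white noise
        da/dt = -a/tau_a + sqrt(2 D_a) * eta(t)  # slow adaptation noise

    with v reset to 0 at threshold 1 and a spike-triggered increment
    a -> a + delta_a / tau_a.  Returns the sequence of ISIs.
    """
    v, a = 0.0, 0.0
    t, last_spike = 0.0, 0.0
    isis = []
    while len(isis) < n_spikes:
        v += (mu - a) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
        a += -a / tau_a * dt + np.sqrt(2.0 * D_a * dt) * rng.standard_normal()
        t += dt
        if v >= 1.0:
            v = 0.0
            a += delta_a / tau_a
            isis.append(t - last_spike)
            last_spike = t
    return np.array(isis)

# Limit case 1: deterministic adaptation driven by fast white noise
isis_white = simulate_pif(D=0.01, D_a=0.0)
# Limit case 2: stochastic adaptation without fast fluctuations
isis_adapt = simulate_pif(D=0.0, D_a=1e-4)
print(isis_white.std() / isis_white.mean(),
      isis_adapt.std() / isis_adapt.mean())
```

Comparing the two returned ISI sequences (histograms, serial correlations) reproduces the qualitative contrast between the two limit cases discussed in the text.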

The fraction of open channels

The calculation of the ISI statistics (histogram and serial correlations) of the PIF model with noise and spike-frequency adaptation is generally a hard theoretical problem. Here we put forward several novel approximations for the simple limit cases Eq. (2) and Eq. (3). For typical adaptation time constants that are much larger than the mean ISI we found the ISI histogram in the case of pure white noise (

In the opposite limit of only adaptation fluctuations (

As before, a spike is fired whenever

Interestingly, the scaling factor in Eq. (6) has a concrete meaning in terms of spike-frequency adaptation:

One central claim of this paper is that ISI histograms of neurons, for which slow channel noise dominates the ISI variability, cannot be described by an inverse Gaussian (IG) distribution in contrast to cases where fast fluctuations dominate. We recall that the IG distribution yields the ISI histogram for a PIF model driven by white noise without any (deterministic or stochastic) adaptation, so

By contrast, the ISI histogram in the case of stochastic adaptation as illustrated in

This suggests that both cases – deterministic and stochastic adaptation – might be distinguishable from the shape of the ISI histograms even if mean and CV of the ISIs are comparable as for the data in

The rescaled kurtosis also reveals differences between the channel and the diffusion model. In

In the case of stochastic adaptation, we obtained analytical expressions for

These expressions only depend on the non-dimensional parameter

These predictions are confirmed by simulations using different

For deterministic adaptation

In the intermediate range, where the time scale of the adaptation is of the same order as the mean ISI, a pronounced minimum of

Another clear distinction between stochastic and deterministic adaptation is revealed by the correlations between ISIs. In several modeling studies it has been found that negative feedback mechanisms like adaptation currents in the presence of white noise give rise to negative correlations between adjacent ISIs

Noting that

In striking contrast, the case of stochastic adaptation yields positive ISI correlations with a slow exponential decay (

ISI correlations are strongly governed by the time scales in the system. We therefore investigated the role of the time scale separation parameter

For stochastic adaptation, the positive correlations become strongest for

So far, we found that the two limit cases of the adapting PIF model can be well distinguished by the values of the shape parameters

For small channel numbers, i.e. large channel noise, we observed both large values of the rescaled kurtosis

For a fixed level of white noise (

We investigated whether our theoretical predictions based on a simple integrate-and-fire model are robust with respect to a more detailed model of the Hodgkin-Huxley type. To this end, we performed simulations of the conductance-based Traub-Miles model with an M-type adaptation current

The different ISI statistics for the case of deterministic and stochastic adaptation are analyzed more closely in

A clear distinction between both cases appears in the serial correlations of ISIs (

Finally, we inspected the case in which both white noise and slow adaptation noise are present (

The observation that key features of the ISI statistics in the presence of a stochastic adaptation current seem to be conserved across different models suggests a common mechanism underlying these features. As we saw, this mechanism is based upon the fact that a stochastic adaptation current can be effectively described by an

In this paper, we have studied how a noisy adaptation current shapes the ISI histogram and the correlations between ISIs. In particular, we have compared the case of pure stochastic adaptation with the case of a deterministic adaptation current and an additional white noise current. Using both a perfect IF model that is amenable to analytical calculations and a more detailed Hodgkin-Huxley type model, we found large differences in the ISI statistics depending on whether noise was mediated by the adaptation current or originated from other noise sources with fast dynamics. As regards the ISI histogram, stochasticity in the adaptation leads to pronounced peaks and a heavy tail compared to the case of deterministic adaptation, for which the ISI density is close to an inverse Gaussian. To quantify the shape of ISI histograms, we proposed two measures that allow for a simple comparison with an inverse Gaussian probability density that has the same mean and variance. The first is a rescaled skewness (involving the third ISI cumulant); the second is a rescaled kurtosis (involving the fourth ISI cumulant). Both quantities equal unity for an inverse Gaussian distribution. If they are larger than unity, as in the case of stochastic adaptation, the ISI density is more skewed or, respectively, has a sharper peak and a heavier tail than an inverse Gaussian density with the same variance. If these measures are smaller than one, the ISI histogram tends to be more Gaussian-like. Most strikingly, we found that for a stochastic adaptation current the rescaled skewness and kurtosis strongly increase when the time scale separation of adaptation and spiking becomes large (

Another pronounced difference arises in the ISI correlations. For a deterministic adaptation current and white-noise driving, one observes short-range anti-correlations between ISIs, as reported previously (e.g.

Interestingly, the perfect integrate-and-fire model augmented with an adaptation mechanism predicted all the features seen in the spiking statistics of the Traub-Miles model with stochastic adaptation and/or white noise input. This indicates the generality and robustness of our findings. It also justified the use of the adapting PIF model as a minimal model for a repetitive firing neuron with spike-frequency adaptation. It seems that, in the suprathreshold regime, the details of the spike generator are of minor importance compared to the influence of adaptation and slow noise.

By means of the PIF model one can theoretically understand the underlying mechanism leading to the large kurtosis and the positive ISI correlations in the case of stochastic adaptation. This rests upon the fact that slow adaptation noise effectively acts as an independent colored noise with a large correlation time. One can think of this colored noise as a slow external process that modulates the instantaneous firing rate or, equivalently, slowly changes the ISIs in the sequence. Such a sequence of many short ISIs in a row followed by a few long ISIs gives rise to a large skewness and kurtosis and to positive serial correlations. In previous works, slow processes which cause positive ISI correlations were often assumed to originate in the external stimulus
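This mechanism can be caricatured without any neuron model at all: an Ornstein-Uhlenbeck process, sampled once per interval, slowly modulates the ISIs. The correlation time, noise amplitude, and mean ISI below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# A slow Ornstein-Uhlenbeck process, sampled once per interval, modulates
# the ISIs. Its long correlation time tau (in units of the mean ISI)
# induces positive, slowly decaying correlations between intervals.
n, tau, sigma = 50_000, 50.0, 0.1
rho = np.exp(-1.0 / tau)                 # per-interval OU autocorrelation
eta = np.zeros(n)
for i in range(1, n):
    eta[i] = rho * eta[i - 1] + np.sqrt(1.0 - rho**2) * sigma * rng.standard_normal()
isis = 1.0 + eta                         # mean ISI of 1 plus slow modulation

lag1 = np.corrcoef(isis[:-1], isis[1:])[0, 1]
lag10 = np.corrcoef(isis[:-10], isis[10:])[0, 1]
print(lag1, lag10)                       # both positive; lag10 below lag1
```

The resulting sequence shows runs of short intervals interrupted by stretches of long ones, which is exactly what produces the positive, slowly decaying serial correlations described above.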

Spike-frequency adaptation has been ascribed to different mechanisms (see e.g.

Here, the Gaussian white noises

These equations can indeed be put into the form of Eq. (1) by splitting the deterministic and the noise part of

The adaptation currents

This only increases the mean adaptation to

The colored noise approximation can be carried out in an analogous manner yielding the same result Eq. (4) (again with

The analytical calculation of

For many neurons, physiological details like the number of ion channels are hard to obtain directly from experiments. Instead, given the spike-train statistics of a neuron, our study could be useful for judging whether M-channels or other adaptation mechanisms could potentially contribute to the neuronal variability. Furthermore, one can conceive of experiments in which the number of adaptation channels is reduced (e.g. by mild application of a channel blocker), so that the effects of stochastic adaptation are altered in a controlled way. Another possibility to test our predictions would be to vary the firing rate of the neuron by increasing or decreasing the input current. In this way, the time scale of spiking would change relative to the time scale of adaptation and, thus, the colored-noise effect of adaptation noise could be enhanced or attenuated, respectively.

Channel noise can crucially influence neural firing, especially in the absence of synaptic input

Realistic numbers of M-type channels per neuron are difficult to estimate and the numbers used in this paper must be seen as a tuning parameter for the channel noise intensity. Channel densities of the M-type have been estimated to be of the order of one functional channel per

The diffusion approximation for the stochastic dynamics of ion channel populations (also known as Langevin or Gaussian approximation) has been studied by several authors

Spike-frequency adaptation has been commonly studied with regard to its mean effect on the firing rate

To analyze the slow, voltage-dependent adaptation channels in a simple setup we consider a population of

For

Here,

Therein,

In the conductance-based Traub-Miles model, we will use the kinetic scheme with the rate functions given by

Inserting this relation into Eq. (21), one finds that both rate functions show a quasi-sigmoidal dependence on the voltage (see

For the integrate-and-fire (IF) model we employ a simplified variant that distinguishes only between two values of

This function and the set of spike times

In the definition of

Taking the open probability exactly as the fraction of open channels amounts to taking the limit of an infinite population of channels. In the Traub-Miles model or the integrate-and-fire model with deterministic adaptation, this corresponds to adding an adaptation current of the form

In this equation,

For a finite channel population, the fraction of open channels is given by the stochastic quantity

When the channel number is varied, we assume that the maximal conductance per unit membrane area

The channel model, Eq. (20), can be approximated by a single Langevin equation for

Furthermore, a separation of the adaptation into a deterministic and a stochastic part,

Note that Eq. (1), which will be our final diffusion model, also involves the additive-noise approximation presented below.
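Such a diffusion approximation can be illustrated numerically by comparing an exact Markov simulation of a population of two-state channels with the corresponding Langevin equation for the open fraction. The rates and population size below are assumptions for illustration, not the parameters of Eq. (20).

```python
import numpy as np

rng = np.random.default_rng(2)

# N two-state channels open with rate alpha and close with rate beta
# (illustrative values). Compare the exact Markov update (binomially
# distributed numbers of transitions per step) with the Langevin
# (diffusion) approximation for the open fraction x, whose noise
# variance per unit time is (alpha*(1-x) + beta*x) / N.
N, alpha, beta, dt, steps = 1000, 0.2, 0.8, 0.01, 50_000

n_open = 0                    # exact simulation: number of open channels
x = 0.0                       # diffusion approximation: open fraction
sum_exact = sum_diff = 0.0
for _ in range(steps):
    n_open += rng.binomial(N - n_open, alpha * dt) - rng.binomial(n_open, beta * dt)
    drift = alpha * (1.0 - x) - beta * x
    sigma = np.sqrt((alpha * (1.0 - x) + beta * x) * dt / N)
    x = np.clip(x + drift * dt + sigma * rng.standard_normal(), 0.0, 1.0)
    sum_exact += n_open / N
    sum_diff += x

# Both fluctuate around the stationary mean alpha / (alpha + beta) = 0.2
print(sum_exact / steps, sum_diff / steps)
```

For a population of this size the two traces agree in mean and fluctuation statistics, which is the regime in which replacing the channel model by a single Langevin equation is justified.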

The perfect integrate-and-fire (PIF) model

The last term in Eq. (35) represents fast Gaussian input fluctuations of intensity

It is a feature of the PIF model that the firing rate is directly proportional to a constant driving current and independent of the noise. However, even for the slowly varying driving current

Solving for

The noise term in Eq. (31) or (34) is multiplicative, because both arguments of the noise strength

For the standard parameter set used in this paper,

It is useful to consider a non-dimensional form of Eq. (35). To this end, we henceforth measure voltages in units of

The dynamics of the PIF model is completely determined by the four non-dimensional parameters

In the last step, Eqs. (38) and (41) were used. The second term of

Again, the adaptation variable

In accordance with Eq. (37), the mean adaptation is proportional to the firing rate

In the case of deterministic adaptation (

In the case of stochastic adaptation (

Multiplication with

Finally, we perform the variable transformation

As can be seen from Eq. (53a), the stationary firing rate for stochastic adaptation is again given by Eq. (51).

If the stimulus depends on time, i.e. if

As an example, let us consider a step of the base current at time

This represents exactly the exponential decay of the firing rate in response to a step stimulus, which is typical for spike-frequency adaptation. Eq. (55) has been used in
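This step response can be reproduced with a deterministic, noiseless simulation: the instantaneous firing rate jumps at the current step and then relaxes to a lower adapted value. All parameter values in the sketch below are illustrative assumptions, not the paper's calibration or Eq. (55) itself.

```python
import numpy as np

# Deterministic adapting PIF response to a step of the base current
# (illustrative parameters): after the step, the instantaneous firing
# rate jumps and then relaxes to a lower adapted value, as is typical
# for spike-frequency adaptation.
mu0, mu1, tau_a, delta_a, dt = 1.0, 2.0, 100.0, 0.5, 0.005

v, a, t = 0.0, 0.0, 0.0
spikes = []
while t < 300.0:
    mu = mu0 if t < 100.0 else mu1          # step of the base current at t = 100
    v += (mu - a) * dt
    a += -a / tau_a * dt
    t += dt
    if v >= 1.0:                            # threshold crossing = spike
        v = 0.0
        a += delta_a / tau_a                # spike-triggered adaptation increment
        spikes.append(t)

isis = np.diff(spikes)
ends = np.array(spikes)[1:]                 # time at which each ISI ends
r_pre = 1.0 / isis[np.searchsorted(ends, 98.0)]      # rate shortly before the step
r_onset = 1.0 / isis[np.searchsorted(ends, 101.0)]   # rate right after the step
r_adapted = 1.0 / isis[-1]                           # adapted rate long after
print(r_pre, r_onset, r_adapted)
```

The onset rate overshoots the final adapted rate, and the instantaneous rate decays between the two, mirroring the exponential relaxation described in the text.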

The dynamical response to a step stimulus yields an interesting interpretation of the scaling factor

In the following,

The second cumulant equals the variance and is related to the coefficient of variation (CV), which is a measure of ISI variability:

We further consider the third cumulant, which is related to the skewness

Roughly speaking, the kurtosis indicates how much of the variability is due to events that strongly deviate from the mean value. For instance, a unimodal ISI density with a heavier tail than another ISI density with the same CV tends to exhibit a larger kurtosis. This is typically accompanied by a more pronounced peak close to the mean value, which balances the heavy tail.

In this paper, we want to compare the ISI density with an inverse Gaussian probability density serving as a reference statistic. For an inverse Gaussian ISI distribution (see below) one observes that the skewness is proportional to the CV and the kurtosis scales like the squared CV. This suggests introducing rescaled versions of the skewness and the kurtosis as follows:

By defining the rescaled skewness and kurtosis in this manner, we obtain measures that are identical to unity for an inverse Gaussian ISI density. For larger (smaller) values, the ISI density is respectively more (less) skewed and more (less) peaked compared to an inverse Gaussian density. This procedure is somewhat analogous to the definition of the CV, for which the Poisson process serves as a reference for the ISI variability with
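The rescaled measures can be estimated from an ISI sample as sketched below. The normalization assumes the standard inverse Gaussian values (skewness equal to 3 times the CV, excess kurtosis equal to 15 times the squared CV) and that the kurtosis in question is the excess kurtosis; both are assumptions about the precise definitions, which are elided here.

```python
import numpy as np

rng = np.random.default_rng(3)

def rescaled_shape(isis):
    """Rescaled skewness and kurtosis of an ISI sample, normalized so
    that both equal one for an inverse Gaussian: for an IG density the
    skewness is 3*CV and the excess kurtosis is 15*CV^2."""
    m, s = isis.mean(), isis.std()
    cv = s / m
    z = (isis - m) / s
    skew = np.mean(z**3)
    excess_kurt = np.mean(z**4) - 3.0
    return skew / (3.0 * cv), excess_kurt / (15.0 * cv**2)

# Sanity check: for inverse-Gaussian samples both measures are close to 1
ig_sample = rng.wald(1.0, 10.0, size=2_000_000)
print(rescaled_shape(ig_sample))
```

Values above one then indicate a sharper peak and heavier tail than an inverse Gaussian with the same mean and variance; values below one indicate a more Gaussian-like histogram.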

Furthermore, we consider the correlations among ISIs as quantified by the serial correlation coefficient
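From a measured ISI sequence, the serial correlation coefficient at lag k can be computed with the standard stationary Pearson estimator, as in this minimal sketch:

```python
import numpy as np

def serial_correlation(isis, k):
    """Lag-k serial correlation coefficient rho_k of an ISI sequence,
    i.e. the correlation between intervals T_i and T_{i+k}
    (standard stationary estimator)."""
    d = np.asarray(isis, dtype=float)
    d = d - d.mean()
    return np.dot(d[:-k], d[k:]) / np.dot(d, d)

# For a renewal process (independent ISIs) all rho_k with k >= 1 vanish
rng = np.random.default_rng(4)
renewal_isis = rng.exponential(size=100_000)
print(round(serial_correlation(renewal_isis, 1), 3))
```

Negative values at lag one are the signature of deterministic adaptation with white noise, while positive, slowly decaying values indicate stochastic adaptation, as discussed in the Results.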

For the adapting PIF model, the two limit cases that are considered in this paper could be reduced to simplified models, for which analytical results are partly known. Firstly, in the case of deterministic adaptation, i.e.

This distribution has a mean

Furthermore, by construction we have

The mean adaptation approximation would wrongly predict that the ISIs are uncorrelated. The reason is that in the PIF model driven by only white Gaussian noise any memory of the ISI history is erased upon reset. A better account of ISI correlations is given below.

Secondly, the case of stochastic adaptation can be reduced to a PIF neuron driven by a reduced base current

For

with

Although this formula captures the ISI density at small ISIs, it is of limited use because the second and higher ISI moments diverge. Throughout the paper we have therefore used the full expression Eq. (69). The mean ISI and the firing rate do not depend on the noise statistics; in fact, they are equal to those of the white noise case, Eq. (65), as shown below (derivation of the ISI cumulants). The squared CV can be obtained to second order in

Here we derive an expression for the serial correlation coefficient of a PIF neuron with deterministic adaptation current and white noise driving. We consider the following subthreshold dynamics

If we set

The flow in the phase space spanned by

Solving for the limit cycle parameters

The serial correlation coefficient for the case of weak noise can be understood by considering small deviations from the limit cycle. Weak noise leads to a distribution of

Similarly, noise leads to a distribution of ISIs centered about

The last equation allows us to express the interval correlations

Note that

In order to calculate the correlations

Inserting Eq. (83) into Eq. (80) yields a stochastic map for

Multiplying Eq. (84) by

Linearizing the exponential

We mention that this expression is consistent with the result in

Using Eq. (82) and Eq. (86) in the expression for the serial correlation coefficient

Following

For the description of the ISI density it is necessary to use the initial condition that corresponds to the distribution of

The ISI density is equal to the time-dependent probability flux across the threshold line

To carry out a weak-noise expansion of Eq. (91) we change to the variables

In these variables the Fokker-Planck equation for

Furthermore, we consider the characteristic function

In the last equation we introduced the function

From Eqs. (97) and (99) it is clear that all cumulants can be generated from the function

To solve Eq. (100) for weak noise, we expand

Inserting into Eq. (100) gives a hierarchy of first-order differential equations for the coefficients

The solutions can be obtained order-by-order using the method of characteristics. Here, we report only the first three coefficients

Higher-order coefficients can be calculated in the same way, although the expressions become quite lengthy. To obtain the kurtosis including the first noise-dependent term correctly, one has to calculate

From the cumulants one finds the rescaled skewness and kurtosis as defined in Eqs. (61) and (62):

To test the generality of our findings, we simulated the conductance-based Traub-Miles model

In this model, the adaptation time constant

For the second model with adaptation channel noise, the voltage is described by

The currents

As in the PIF model, we assumed the adaptation channels to be two-state ion channels with the transition rates

In the model with stochastic adaptation current, the maximal channel conductance