The authors have declared that no competing interests exist.

In this paper we investigate the rate coding capabilities of neurons whose input signals are alterations of a base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of the excitation-inhibition relationship and an established conductance-based leaky integrator model with an adaptive threshold, with parameter sets recreating biologically relevant spiking regimes. We find that, counter-intuitively, for a given mean post-synaptic firing rate an increased ratio of inhibition to excitation generally leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of SNR and dynamic coding range by computing the metabolic efficiency, i.e., the maximal amount of information per ATP molecule expended (in bits/ATP). Moreover, by calculating the metabolic efficiency we are able to predict the shapes of the post-synaptic firing rate histograms, which may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the neuronal model used are the most important for metabolically efficient information transfer.

Neurons communicate by firing action potentials, which can be considered as all-or-none events. The classical rate coding hypothesis states that neurons communicate the information about stimulus intensity by altering their firing frequency. Cortical neurons typically receive a signal from many different neurons, which, depending on the synapse type, either depolarize (excitatory input) or hyperpolarize (inhibitory input) the neural membrane. We use a neural model with excitatory and inhibitory synaptic conductances to reproduce in-vivo like activity and investigate how the intensity of presynaptic inhibitory activity affects the neuron's ability to transmit information through a rate code. We reach the counter-intuitive result that an increase in inhibition improves the signal-to-noise ratio of the neural response, despite introducing additional noise to the input signal. On the other hand, inhibition also limits the neuronal output range. In the end, however, the actual amount of information transmitted (in bits per energy expended) is remarkably robust to the level of inhibition present in the system. Our approach also yields predictions in the form of post-synaptic firing rate histograms, which can be compared with in-vivo recordings.

Cortical neurons receive input in the form of bombardment by action potentials (spikes) from other neurons and process and communicate the received information further by transmitting their own action potentials to other neurons. Individual action potentials do not differ in their time course and therefore, from the information processing point of view, they can be seen as all-or-none events. The response to a particular stimulus is therefore represented by a spike train—a sequence of times when an action potential was produced [

According to the efficient-coding hypothesis [

Given that the cortex has only a limited energy budget and information transfer is costly [

Typical approaches to information-theoretical analyses of single cells are either the use of the direct method [

Both the direct method and capacity analysis can be extended to account for the metabolic expenses. One of the earliest efforts to relate the information capacity to the metabolic expenses is that of Laughlin et al. [

In the presented work we utilize the MAT (Multi-timescale Adaptive Threshold) model [

(A) Stimulus consisting of excitatory and inhibitory synaptic conductances, generated as shot noise with an exponential envelope, is delivered to the neuronal model, a passive leaky membrane with a dynamic threshold. The measured response is the number of spikes in a specified time window (e.g., 250 ms). (B) For each stimulus intensity the full response distribution is obtained. The mean response (solid) and its standard deviation (shaded) are shown for illustration. (C) We find the probability distribution of inputs that maximizes the mutual information between the stimulus and the response per single spike. The predicted histogram of post-synaptic firing rates (PSFR) can be compared with experimental data.

The main contributions and the structure of this work can be summarized as follows:

By applying the results of Witsenhausen [

We qualitatively discuss the stimulus-response relationships and the capacity-cost functions, show the stabilizing effect of inhibition on the membrane potential fluctuations, and discuss the implications for the given neuronal model.

We analyze the effect of inhibition on the information-metabolic efficiency and on more intuitive indicators of information transmission efficiency. We find that for a given mean post-synaptic firing rate, counter-intuitively, an increased ratio of inhibition to excitation generally leads to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron.

We present the predicted PSFR histograms and discuss the comparability with experimental data. In combination with the relative simplicity of fitting the parameters of the MAT model to real neurons, the presented framework allows us to predict the PSFR histograms for a wide variety of neurons. Furthermore, we observe that the shapes of the histograms depend only marginally on the rate coding time scale.

We show the predicted optimal input distributions and point out the robustness of the metabolic efficiency and the PSFR histogram to changes in the input distribution.

We explain the effect of the model parameters on the obtained results and the significance of the spontaneous firing rate. We use parameter values fitted by Kobayashi et al. [

The membrane potential of the MAT model is governed by the equation of a passive leaky membrane,

τ_m dV/dt = −(V − E_L) + I_syn(t)/g_L,

where τ_m is the membrane time constant, g_L the leak conductance, E_L = −80 mV the leakage potential and I_syn(t) the synaptic current. Spikes are fired when the membrane potential reaches (or is above) the value of a dynamic threshold θ(t) = ω + Σ_j H_j(t), where the component H_j(t) increases by α_j every time a spike occurs and then decays with the time constant τ_j. An absolute refractory period of 2 ms is introduced, during which the dynamics of the membrane potential and the threshold remain unchanged, but a spike cannot be fired. The parameters used to replicate the behavior of neurons from different classes (regular spiking, RS; intrinsic bursting, IB; fast spiking, FS; chattering, CH) were identified by Kobayashi et al. [
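The dynamics above can be integrated with a simple Euler scheme. The sketch below is illustrative only: the parameter values (threshold weights, time constants, ω) are placeholders, not the fitted values from Kobayashi et al., and the synaptic current trace is supplied by the caller.

```python
import numpy as np

def simulate_mat(I_syn, dt=0.1, tau_m=5.0, E_L=-80.0, g_L=1.0,
                 alphas=(30.0, 2.0), taus=(10.0, 200.0),
                 omega=-55.0, t_ref=2.0):
    """Euler integration of a passive leaky membrane with a MAT-style
    two-timescale adaptive threshold. Parameter values are illustrative
    placeholders, not the fitted values from the paper.

    I_syn: array of synaptic current values (one per time step, units
    chosen so that I_syn/g_L is in mV). Returns spike times in ms.
    """
    V = E_L
    H = np.zeros(len(alphas))              # adaptive threshold components
    taus = np.asarray(taus)
    alphas = np.asarray(alphas)
    last_spike = -np.inf
    spikes = []
    for i, I in enumerate(I_syn):
        t = i * dt
        # membrane: tau_m dV/dt = -(V - E_L) + I_syn/g_L (no reset at spike)
        V += dt / tau_m * (-(V - E_L) + I / g_L)
        # each threshold component decays with its own time constant
        H *= np.exp(-dt / taus)
        theta = omega + H.sum()
        if V >= theta and t - last_spike >= t_ref:
            spikes.append(t)               # spike: threshold jumps by alphas
            H += alphas
            last_spike = t
    return spikes
```

Note that the membrane potential is not reset after a spike; all spike-triggered adaptation is carried by the threshold components.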

The synaptic current is given by

I_syn(t) = g_exc(t)(E_exc − V) + g_inh(t)(E_inh − V),

where g_exc, g_inh are the total conductances of the excitatory and inhibitory synapses and E_exc = 0 mV, E_inh = −75 mV are the respective synaptic reversal potentials. We consider the excitatory and inhibitory conductances to be shot noise with an exponential envelope,

g_exc(t) = b_exc Σ_{t_j ≤ t} exp(−(t − t_j)/τ_exc),  g_inh(t) = b_inh Σ_{t_k ≤ t} exp(−(t − t_k)/τ_inh),

where the arrival times {t_j}, {t_k} are generated by independent Poisson point processes with intensities λ_exc, λ_inh (to mimic the arrival of excitatory and inhibitory synaptic inputs), b_exc, b_inh are the peak conductances, and τ_exc, τ_inh are the time constants of those synapses, which were chosen as 3 ms for the excitatory and 10 ms for the inhibitory synapses [...]. We refer to λ_exc as the stimulus intensity.
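As a concrete illustration, such a shot-noise conductance can be generated by drawing Poisson arrival counts per time bin and applying the exponential decay recursively. The function name and argument conventions below are ours, not from the paper's code.

```python
import numpy as np

def shot_noise_conductance(rate, b, tau, T, dt, rng):
    """Shot-noise conductance with an exponential envelope:
    g(t) = b * sum_j exp(-(t - t_j)/tau) over Poisson arrival times t_j.

    rate: arrival intensity (events/ms); b: peak conductance per event;
    tau, T, dt in ms. Returns g sampled every dt.
    """
    n = int(T / dt)
    decay = np.exp(-dt / tau)                 # per-step exponential decay
    arrivals = rng.poisson(rate * dt, size=n) # arrivals per time bin
    g = np.empty(n)
    acc = 0.0
    for i in range(n):
        acc = acc * decay + b * arrivals[i]   # decay, then add new events
        g[i] = acc
    return g
```

A useful sanity check: for an exponential kernel the stationary mean conductance is rate · b · tau (Campbell's theorem).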

To recreate biologically plausible conditions, we calculate the peak conductances and minimal intensities of the Poisson processes so that the conductances g_exc and g_inh correspond to values reported in [

The response is the number of spikes fired within the coding time window in reaction to the stimulus intensity λ_exc. In our work we compare the results for five different lengths of the coding time window: 100 ms, 200 ms, 300 ms, 400 ms and 500 ms.

The numerical integration procedure is described in

The metabolic cost of neuronal activity is determined mainly by the activity of the Na^{+}/K^{+} ionic pump in the neuronal membrane, pumping the excess Na^{+} out of the neuron. The main contributors to the overall cost are then: 1. reversal of Na^{+} entry at resting potential, 2. reversal of ion fluxes through post-synaptic receptors, 3. reversal of Na^{+} entry for action potentials and 4. additional costs associated with the action potential [

We follow the estimates from [...], which put the cost of maintaining the resting potential at 0.342 ⋅ 10^{9} ATP molecules per second, the cost of reversal of Na^{+} entry for action potentials at 0.384 ⋅ 10^{9} ATP molecules per single action potential and the costs associated with vesicle release due to an action potential at 0.328 ⋅ 10^{9} ATP molecules, the per-spike costs adding up to approximately 0.71 ⋅ 10^{9} ATP molecules per spike.

To calculate the cost needed to reverse the ion fluxes through post-synaptic receptors, we follow the approach used in [...] and decompose the synaptic conductance into the contributions of Na^{+} and K^{+} channels,

g_syn(E_syn − V) = g_Na(E_Na − V) + g_K(E_K − V),

where E_Na = 90 mV, E_K = −105 mV are the reversal potentials of the Na^{+} and K^{+} channels. Matching both sides for all V gives g_Na = g_syn (E_syn − E_K)/(E_Na − E_K), and the current due to the influx of Na^{+} ions is then I_Na(t) = g_Na(t)(E_Na − V(t)). Integrating I_Na over time and dividing by the elementary charge gives the number of Na^{+} ions that have to be extruded. The ion pump uses one ATP molecule for every 3 Na^{+} extruded.

Substituting for I_Na(t) and averaging over the Poisson inputs with intensities λ_exc, λ_inh, we obtain the approximate formula for the cost of reversal of the synaptic currents.

The total cost of the signaling, given the input (λ_exc, λ_inh), is then the sum of the resting cost, the cost of reversal of the synaptic currents, and the per-spike cost multiplied by n(λ_exc, λ_inh), the average number of spikes observed for the given input.
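Under the estimates above, this cost bookkeeping is straightforward to sketch in code. The unit conventions (nA, ms) and the function names below are our assumptions for illustration:

```python
E_CHARGE = 1.602e-19    # elementary charge (C)
REST_COST = 0.342e9     # ATP/s at rest (estimate quoted in the text)
SPIKE_COST = 0.71e9     # ATP per action potential (estimate quoted in the text)

def synaptic_atp(i_na_trace, dt_ms):
    """ATP needed to extrude the Na+ carried by the current trace
    i_na_trace (in nA, sampled every dt_ms milliseconds).
    The pump uses one ATP to extrude 3 Na+ ions."""
    charge = sum(abs(i) for i in i_na_trace) * dt_ms * 1e-12  # nA*ms -> C
    n_na = charge / E_CHARGE                                  # number of Na+ ions
    return n_na / 3.0

def total_cost(duration_s, mean_spike_count, syn_atp):
    """Total signaling cost: resting + synaptic reversal + spiking (in ATP)."""
    return REST_COST * duration_s + syn_atp + SPIKE_COST * mean_spike_count
```

For example, a constant 1 nA Na+ current sustained for 1 s carries about 6.2 ⋅ 10^{9} Na+ ions, i.e., roughly 2.1 ⋅ 10^{9} ATP molecules of pumping cost.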

In the framework of information theory, the input is a random variable X representing the stimulus intensity λ_exc, which is a real number from an interval [λ_min, λ_max], and the response R is the number of spikes in the coding time window. Given an input distribution p, observing a particular response carries a value of information i_p:

By averaging the value of information over all possible outputs, we get the specific information.

Given the input probability distribution p, the mutual information I_p is then the average of the specific information over all inputs. The capacity-cost function C(W) is the supremum of I_p over all input distributions satisfying W_p ≤ W, where W_p denotes the average metabolic cost under p.

It follows from the Lagrangian theorem [...] and the concavity of the capacity-cost function that the ratio C(W)/W attains its maximum either at some W < W_max or at W_max. The maximal ratio then expresses the amount of information per unit cost, which motivates the definition of the information-metabolic efficiency.
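The paper optimizes the input distribution with the Jimbo-Kunisawa algorithm; as a simplified stand-in, the sketch below uses the closely related Blahut iteration for the capacity-cost trade-off on a discretized channel (maximizing I − s·W for a multiplier s, then scanning s for the best information-per-cost ratio). The channel matrix and cost vector here are toy placeholders, not the neuron model.

```python
import numpy as np

def blahut_cost(P, w, s, iters=200):
    """Blahut iterations maximizing I(p) - s*W(p) for a discrete channel
    P[x, r] = Pr(r | x) with input costs w[x].
    Returns (input distribution, mutual information in bits, mean cost)."""
    nx = P.shape[0]
    p = np.full(nx, 1.0 / nx)
    for _ in range(iters):
        r = p @ P                                    # output marginal
        with np.errstate(divide="ignore", invalid="ignore"):
            logratio = np.where(P > 0, np.log(P / r), 0.0)
        D = (P * logratio).sum(axis=1)               # KL(P[x] || r), in nats
        p = p * np.exp(D - s * w)                    # multiplicative update
        p /= p.sum()
    r = p @ P
    with np.errstate(divide="ignore", invalid="ignore"):
        logratio = np.where(P > 0, np.log(P / r), 0.0)
    I = (p[:, None] * P * logratio).sum() / np.log(2)
    return p, I, p @ w

def efficiency(P, w, s_grid):
    """Scan the multiplier s and return the best information per unit cost."""
    best = 0.0
    for s in s_grid:
        _, I, W = blahut_cost(P, w, s)
        best = max(best, I / W)
    return best
```

For a noiseless binary channel with unit input costs, the routine recovers the expected 1 bit of capacity at unit cost.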

We will refer to a regime in which the neuron encodes the maximal possible amount of information per unit of energy as the metabolically efficient regime.

Since the response

Theoretical results show that the support of the optimal input distribution is discrete, consisting of only a finite number of points.

The theory presented above holds for memoryless information channels without feedback, i.e., the response to the stimulus depends only on the current stimulus and not on any past stimuli or responses of the channel. However, real neurons exhibit adaptation to the stimulus, and therefore the stimulus-response relationship depends on the stimulation history.

We evaluated the information transmission capabilities for different stimulation scenarios distinguished by the amount of inhibition associated with the stimulus. In each scenario, the frequency of excitatory synapses ranged from

From the stimulus-response relationships (

Stimulus-response relationships for the MAT neurons specified by the parameters in Table A in

Capacity-cost function (panel A) and capacity per spike (panel B) for the case of coding time window Δ = 100 ms and inhibition scaling factor

We observed that higher inhibition to excitation ratios lead to lower membrane potential fluctuations. This arises as an effect of synaptic filtering and reversal potentials, which are both biologically important parts of neural communication and essential for the observation of this phenomenon (see

The decrease in the membrane potential's standard deviation leads to a more reliable firing rate (response) and subsequently a higher signal-to-noise ratio (SNR) in regimes with stronger inhibition (

(A) Signal-to-noise ratio (SNR,

The higher ratio of inhibition to excitation also has some negative consequences:

The inhibition limits the possible depolarization of the membrane and the neuron is unable to attain high firing rates. We quantify this by defining the coding range as the width of the interval of attainable firing rates, from the spontaneous rate to the maximal rate.
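For a simulated stimulus-response table, both indicators can be estimated directly. Note that the SNR convention used below (variance of the mean responses across stimuli over the mean trial-to-trial variance) and the coding-range estimate via the span of mean responses are our assumptions for illustration; the paper's exact definitions may differ.

```python
import numpy as np

def snr_and_coding_range(responses):
    """responses: dict mapping stimulus intensity -> array of spike
    counts over repeated trials.

    SNR: variance of the tuning curve (mean responses across stimuli)
    divided by the trial-to-trial variance averaged across stimuli.
    Coding range: span between the largest and smallest mean response.
    """
    means = np.array([np.mean(r) for r in responses.values()])
    noise = np.array([np.var(r) for r in responses.values()])
    snr = means.var() / noise.mean()
    coding_range = means.max() - means.min()
    return snr, coding_range
```

On a toy table with two well-separated stimuli and small trial noise, the SNR is large and the coding range equals the spread of the mean responses.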

To attain an identical mean firing rate with a higher inhibition to excitation ratio, the excitatory synaptic current has to be larger, and therefore such stimulation is associated with higher metabolic costs (

(A) Cost of response for a given input _{exc}, RS neuron (Tab A in

Surprisingly, the information-theoretic efficiency is generally unaffected by the level of inhibition, meaning that the increase in signal-to-noise ratio and the decrease in coding range effectively cancel out. This holds up to a certain point, when the coding range becomes too narrow and the efficiency of information transfer starts dropping dramatically (

By evaluating the information-metabolic efficiency we also obtain the optimal input-output statistics. The resulting optimal post-synaptic firing rate (PSFR) histograms (

Post-synaptic firing rate histograms corresponding to the metabolically efficient regime with the coding time window Δ = 500 ms and inhibition scaling factor

As we showed in the Methods section, the optimal input distribution has non-zero probability only for a finite number of points. However, the optimal conditions can be nearly reached by many different input distributions (

The plots show different input probability distributions obtained from different steps of the Jimbo-Kunisawa algorithm. For each input distribution the estimated efficiency (in bits per 10^{9} ATP molecules) is given in the plot together with the relative error

Yet we can observe that in the metabolically efficient regime, a significant portion of the probability is given to the weakest input, i.e., purely spontaneous activity. For a population of independently encoding neurons this would mean that at any moment, most of them would exhibit only spontaneous activity.

Naturally, longer time windows will lead to a higher signal-to-noise ratio. However, the mutual information resulting from two uses of the channel (inputs {X_1, X_2} and outputs {Y_1, Y_2}) is always at most double the mutual information resulting from a single use [...], with I(X_1; X_2) being maximal for extreme correlation between the inputs, i.e., X_1 = X_2. Moreover, I({X_1, X_2}; Y_1 + Y_2) < I({X_1, X_2}; {Y_1, Y_2}), since we are losing information about the temporal structure of the response. Therefore, given any probability distribution of the inputs, the mutual information for the channel with a half-sized coding time window will always be higher (in bits per second).
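These information-theoretic inequalities can be checked numerically on a toy discrete memoryless channel (the channel matrix below is an arbitrary example, not derived from the neuron model):

```python
import numpy as np

def mi(pxy):
    """Mutual information (bits) of a joint distribution given as a matrix."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return (pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum()

P = np.array([[0.8, 0.2],          # toy memoryless channel P[x, y]
              [0.3, 0.7]])
px = np.array([0.5, 0.5])
I_single = mi(px[:, None] * P)

# fully correlated inputs (X1 = X2 = X): at most double the single-use MI
P_corr = np.einsum('xi,xj->xij', P, P).reshape(2, 4)
I_corr = mi(px[:, None] * P_corr)
assert I_corr <= 2 * I_single + 1e-12

# independent inputs: summing Y1 + Y2 discards the temporal structure
pxx = np.outer(px, px).reshape(4)                      # joint input (X1, X2)
P_pair = np.einsum('xi,zj->xzij', P, P).reshape(4, 4)  # P[(x1,x2), (y1,y2)]
I_pair = mi(pxx[:, None] * P_pair)
P_sum = np.zeros((4, 3))
for y1 in range(2):
    for y2 in range(2):
        P_sum[:, y1 + y2] += P_pair[:, 2 * y1 + y2]
I_sum = mi(pxx[:, None] * P_sum)
assert I_sum <= I_pair + 1e-12
```

For independent inputs the two-use mutual information equals exactly twice the single-use value, while summing the outputs merges distinguishable response patterns and strictly loses information for this asymmetric channel.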

(A) Signal to noise ratio for different coding time windows as a function of the mean response

In our case, the neurons are not truly memoryless channels. They exhibit adaptation, which we took into consideration in the optimization process by using an algorithm we developed specifically for this purpose (

We observe that while the length of the time window does not significantly influence the mean PSFR, the information capacity at the optimal mean PSFR drops with longer windows and so does the associated efficiency in bits per spike (

In order to provide a meaningful comparison of different firing patterns, we have so far considered such parameters of the MAT model that lead to an approximately equal spontaneous firing rate (by spontaneous firing rate we mean the average response to the background noise, specified in

To calculate the spontaneous activity we take advantage of the approximate formula describing the stationary firing rate

In order to gain a general insight into the dependence of the predictions on the model parameters, we calculated the predicted mean PSFR (

The

We confirmed that _{2} and

The information capacity tells us the maximal amount of information a neuron could potentially reliably transfer. It is, however, beyond the scope of this work to investigate whether neurons utilize their full capacity and, if so, how [

Analyses of this type generally have to rely on a number of assumptions, including the nature of the input and the coding time scale. To mimic the nature of real neuronal synapses, we consider excitatory and inhibitory input with reversal potentials. The typical approach is to model the excitatory and inhibitory conductances as an Ornstein-Uhlenbeck process [

The MAT model is remotely related to the model analyzed by Suksompong et al. [

If the investigated neuronal model exhibits adaptation to the stimulus (as e.g. the MAT model does), the coding time scale is typically significantly limited from below, so that the influence of previous stimuli on the current response is negligible. We try to overcome this issue by proposing an algorithm which partially takes into account the effect of the previous stimulus. This is an important part of the optimization process, because otherwise we could overestimate the benefits of inhibition on the information-metabolic efficiency (Fig A in

The comparison of different noise levels was inspired by the work [

Numerically, our results are consistent with, e.g., [...], in terms of bits per 10^{9} ATP molecules expended. Despite the differences in spiking patterns among the neuronal classes (RS, IB, FS, CH), as quantified by local variability [

We considered both the excitatory and inhibitory rates (added on top of the modulatory background network activity) to scale linearly with the stimulus intensity, since this is the simplest scenario that can be considered. For most of the stimulation scenarios we did not observe a significant change in the information-metabolic efficiency; however, if the inhibitory rates scaled slower than linearly, we could achieve both a high signal-to-noise ratio and a wide coding range. Such a regime is likely to employ very high rates of synaptic bombardment, and therefore in such a case one should also consider the cost of the pre-synaptic activity.

Our results deal with a single neuron, in accord with most of the previously published work [

The results of our work can be summarized as follows:

By employing a novel method for calculating the information transmission capabilities of channels displaying adaptation to the stimulus (

We used the results of Richardson [

We found that the regular spiking (RS) neuron offers the best information transmission per single spike, but when more energy is available, more information can be transmitted by the behavior common to fast spiking (FS) neurons. Neurons exhibiting bursting behavior (IB, CH) were shown not to be very effective for rate coding in the investigated regimes.

Due to adaptation effects, shorter rate coding time windows led to lower signal-to-noise ratios. Despite the increase in noise, information can be transferred more efficiently with shorter time windows. However, we observed that the length of the time window does not significantly affect the shape of the PSFR histograms, which have the potential to be compared to experimental data.

We found that the metabolic efficiency is surprisingly robust to changes in the amount of inhibition accompanying the excitation. Moreover, we observed that increased inhibition leads to a higher signal-to-noise ratio, but also to a drop in the coding range. This does not affect the metabolic efficiency significantly until a certain point, when the coding range becomes so narrow that information cannot be transferred efficiently by a rate code.

We pointed out that the optimal input for a neuron using a rate code has non-zero probability only for a finite number of inputs. However, by showing different input distributions that nearly achieve the optimal information-metabolic efficiency, we illustrated that the discreteness of the input is not a necessary condition for effective communication.

The core of the simulation code was written in C++ and packaged as a Python module using Cython. This module is available on GitHub (


We would like to thank Prof. Ryota Kobayashi for helpful discussion and for providing additional data.