
Dynamic Control of Synchronous Activity in Networks of Spiking Neurons

  • Axel Hutt,

    Affiliation Deutscher Wetterdienst, Section FE12 - Data Assimilation, 63067, Offenbach am Main, Germany

  • Andreas Mierau,

    Affiliation Institute of Movement and Neurosciences, German Sport University, Cologne, Germany

  • Jérémie Lefebvre

    jeremie.lefebvre@uhnresearch.ca

    Affiliations Krembil Research Institute, University Health Network, Toronto, Ontario, M5T 2S8, Canada, Department of Mathematics, University of Toronto, Toronto, Ontario, M5S 3G3, Canada


Abstract

Oscillatory brain activity is believed to play a central role in neural coding. Accumulating evidence shows that features of these oscillations are highly dynamic: power, frequency and phase fluctuate alongside changes in behavior and task demands. The roles and mechanisms supporting this variability are, however, poorly understood. We here analyze a network of recurrently connected spiking neurons with time delay displaying stable synchronous dynamics. Using mean-field and stability analyses, we investigate the influence of dynamic inputs on the frequency of firing rate oscillations. We show that afferent noise, mimicking inputs to the neurons, causes smoothing of the system’s response function, displacing equilibria and altering the stability of oscillatory states. Our analysis further shows that these noise-induced changes cause a shift of the peak frequency of synchronous oscillations that scales with input intensity, leading the network towards critical states. We lastly discuss the extension of these principles to periodic stimulation, in which externally applied driving signals can trigger analogous phenomena. Our results reveal one possible mechanism involved in shaping oscillatory activity in the brain and associated control principles.

Introduction

Brain signals are rife with oscillatory spectral patterns. These rhythmic features, uncovered through both intracranial and non-invasive recordings, have been shown to correlate strongly with cognitive processes, memory, and sensorimotor behavior [1, 2, 3], and are thus believed to be dynamic signatures of specific neural computations. As such, oscillatory activity is ubiquitous throughout the nervous system and represents the focus of an effervescent area of research [4, 5, 6].

Brain oscillations are however far from static. Indeed, cortical rhythms are commonly subject to sudden shifts induced by variations in behavior and cognitive states. Such spectral transitions are notably observed across normal sleep stages [7] or during the recruitment of attention [8]. The variability of oscillatory neural activity within the gamma band has been thoroughly studied and linked to changes in visual stimulus statistics and timely adjustments in local synaptic wiring [9, 10, 11]. However, shifts in brain oscillatory activity are also reliably observed at slower frequencies. Alpha oscillatory activity, in particular, has been found to be highly volatile [12]. The individual alpha peak frequency (iAPF) accelerates during cognitive [13], memory [14] and sensorimotor [15] task performance, as well as following a strenuous bout of physical exercise [16]. In addition, recent studies provide strong evidence that alpha oscillations are one candidate mechanism for gating the temporal window of sensory integration, and thus dictate the resolution of conscious sensory updating. Specifically, deliberate alterations of the iAPF within individual subjects, induced by transcranial alternating current stimulation, have been found to influence the perception of visual stimuli [17]. Consistent with this, individuals with a higher iAPF have vision with finer temporal resolution, and within an individual, spontaneous fluctuations in the iAPF predict visual perception [18]. Furthermore, in a temporal cueing task, forming predictions about when a stimulus will appear can instantaneously bias the phase of ongoing alpha-band oscillations toward an optimal phase for stimulus discrimination [19]. Taken together, these findings suggest that the magnitude of the iAPF is indicative of the level of arousal/attention, preparedness and performance of recruited cortical networks, in which “the faster the better”.
Given that alpha activity operates on much broader spatial and temporal scales [20], spectral transitions observed within those frequency ranges likely rely on more global and distributed mechanisms.

To this day, the mechanisms supporting these larger-scale frequency transitions remain poorly understood. A key question is whether such transitions could be triggered by external stimulation. Recent studies have indeed shown that weak electric fields can perturb individual alpha oscillations and have a direct effect on visual stimulus perception [17,21] and task performance by reinforcing endogenous slow-wave rhythms [1,22].

To explore this question, we here investigate a non-linear network of spiking neurons with time delay, exhibiting alpha-like oscillatory activity. Our results reveal that, despite the noisiness of the connectivity, stimuli implement an online gain-control mechanism in which the peak frequency reflects the activation state of the neurons. We show that neural inputs, here modeled by noise, change the shape of the neurons’ response function, significantly altering the system’s equilibria and stability. Using mean-field analysis of the network’s collective dynamics, we demonstrate that noise causes the system to shift from slow non-linear oscillations to fast linear oscillations, bringing the system towards a critical state. We also derive a frequency tuning curve that relates the network’s synchronous frequency to the noise intensity driving its constituent neurons. We lastly explore how these results also apply to periodic stimuli, opening new perspectives on how external brain stimulation can be used to control neural synchronous activity.

Model

In the present work, we analyze the dynamics of a generic network of N spiking neurons (Fig 1A) whose membrane potentials ui(t) evolve according to the following set of non-linear delay differential equations

dui(t)/dt = α( −ui(t) + Σj wij Xj(t−τ) ) + √(2D) ξi(t),   (1)

where α is the membrane rate constant, wij = [W]ij are synaptic weights and τ is a mean conduction delay. The membrane potential ui(t) represents the deviation from the neuron’s resting potential in the absence of synaptic and external input. The presynaptic spike trains Xj(t) = Σk δ(t − tjk), with the Dirac distribution δ(t), obey the non-homogeneous Poisson processes Xi → Poisson(f[ui]) with rate function f. The firing rate function f has a non-linear sigmoid shape, f[ui] = (1 + exp[−βui])⁻¹, i.e. the maximum firing rate approaches f = 1 for large membrane potentials. All neurons are subjected to afferent pre-synaptic inputs. The synaptic connectivity scheme was set randomly (Fig 1B), such that

wij = ( g + s nij ) / N,   (2)

where g is the mean synaptic strength, s scales the weight variance, and the nij are zero-mean independent Gaussian variables with < nij nkl >NxN = δik δjl with the Kronecker symbol δnm, where < >NxN is an average evaluated over all possible pairs of indices of the matrix W. We also consider other sources of synaptic input, which we model as stochastic elements given by the independent Gaussian white noise processes ξi with zero mean and < ξi ξj >T = δij, where < >T is an average evaluated over an epoch of duration T or over an ensemble of realizations of the processes. Such noise, whose intensity is set by D, is meant to represent the effect of synaptic bombardment on neural membrane potentials. Such inputs can be shown to be well approximated by Gaussian processes in the diffusion limit [23,24], and this is the approach we use in the following analysis. In the present work, we also consider a network of neurons whose spatial mean synaptic action is inhibitory, i.e. g < 0.
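As an illustration, the network of Eq (1) can be simulated with a simple Euler-Maruyama scheme. This is a sketch, not the authors’ code: the network size N, the time step dt, the 1/N weight normalization and the α scaling of spike impulses and rates are illustrative assumptions; the remaining parameters follow Fig 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameters follow Fig 1 where stated; N, dt and the alpha scaling
# of spike impulses and rates are assumptions of this sketch.
N, alpha, beta = 200, 100.0, 300.0
g, s, tau, D = -10.0, 20.0, 0.025, 0.01
dt, steps = 1e-3, 2000                     # 2 s of simulated activity

def f(u):
    """Sigmoid firing-rate function of Eq (1), saturating at 1."""
    z = np.clip(beta * u, -500.0, 500.0)   # avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-z))

# Random connectivity, Eq (2): mean g plus Gaussian disorder of size s
W = (g + s * rng.standard_normal((N, N))) / N

d = int(round(tau / dt))                   # conduction delay in steps
u = np.zeros((steps, N))                   # membrane potentials
spikes = np.zeros((steps, N))              # Poisson spike trains X_i

for t in range(1, steps):
    delayed = spikes[t - d] if t >= d else np.zeros(N)
    u[t] = (u[t - 1] + dt * alpha * (-u[t - 1])   # leak
            + alpha * (W @ delayed)               # delayed synaptic impulses
            + np.sqrt(2.0 * D * dt) * rng.standard_normal(N))
    # spike with probability alpha * f(u) * dt in each time step
    spikes[t] = (rng.random(N) < alpha * f(u[t]) * dt).astype(float)

mean_activity = u.mean(axis=1)             # network mean activity
```

The array `mean_activity` is the quantity whose spectrum is analyzed in Fig 1D.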

Fig 1. Frequency transitions in a random network of spiking neurons.

A. Schematic illustration depicting some features of the network model, in which interconnected cells are driven by independent sources of noise. Individual cells are connected via excitatory (red) and inhibitory (blue) synaptic connections. B. Synaptic connectivity matrix. Weights are randomly distributed around a mean value g (see Eq 2). C. Sample network activity, in which the neurons’ spike timing is modulated by global, slow-wave synchronous oscillations in both low (grey; D = 0.01) and high (blue; D = 0.50) input conditions. Faster and more irregular firing modulations characterize the high-input state. D. Power spectral density of the network mean activity in low (grey; D = 0.01) and high (blue; D = 0.50) input conditions. Other parameters are α = 100Hz, β = 300/mV, g = −10mV/Hz, s = 20mV/Hz, τ = 25ms.

https://doi.org/10.1371/journal.pone.0161488.g001

Under the choice of a specific synaptic connectivity and other model parameters, the network spontaneously engages in synchronous activity at a baseline frequency of 10Hz in the absence of external input. However, this frequency is highly volatile in the presence of stochastic input: as the noise level increases, the network spiking activity becomes more irregular and the synchronous frequency increases, cf. Fig 1C. The power spectrum of the network mean activity is plotted in Fig 1D, where peaks can be observed at the system’s natural frequency and its higher harmonics. As noise intensity increases, the peak frequency and associated harmonics gradually shift towards a higher frequency range.
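The peak frequency reported throughout can be read off the power spectrum of the mean activity; a minimal helper (the FFT-based estimator is our choice, not necessarily the authors’) is:

```python
import numpy as np

def peak_frequency(x, dt):
    """Frequency (Hz) of the largest spectral peak of signal x,
    as read off power spectra like those in Fig 1D."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    psd = np.abs(np.fft.rfft(x)) ** 2      # one-sided power spectrum
    freqs = np.fft.rfftfreq(x.size, d=dt)
    psd[0] = 0.0                           # discard the DC component
    return freqs[np.argmax(psd)]

# a pure 10 Hz test oscillation sampled at 1 kHz is recovered exactly
t = np.arange(0.0, 2.0, 1e-3)
assert abs(peak_frequency(np.sin(2 * np.pi * 10.0 * t), 1e-3) - 10.0) < 0.5
```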

Mean Field Representation for Stochastic Input

To better understand the impact of inputs on the dynamics, let us derive mean-field equations for our system. In the limit where the number of neurons is large, i.e. N → ∞, and the mean firing time scale 1/f is much smaller than the time scale of dendritic currents [25], we can use the response function f to approximate the rate at which recurrent pre-synaptic inputs perturb the activity of a neuron. To this end, one averages ui(t) over a very short time window and introduces a so-called coarsening in time [25, 26]. This well-established transformation allows one to translate population spiking activity into population rate dynamics

dui(t)/dt = α( −ui(t) + Σj wij f[uj(t−τ)] ) + √(2D) ξi(t).   (3)

In the following, we use the same symbol for the original and the temporally coarse-grained membrane potential for notational simplicity. Let us further assume that emerging oscillations occur in a mean-driven regime in which the local dynamics can be seen as small independent fluctuations around the network mean activity, i.e.

ui(t) = ū(t) + vi(t),   (4)

where the network mean activity is given by

ū(t) = < ui(t) >N = (1/N) Σi ui(t),   (5)

and < >N is an average performed over the N units of the network. In the following, we re-scale time by αt → t for notational simplicity. As an ansatz, the local fluctuations vi from the mean obey the Ornstein-Uhlenbeck processes

dvi(t)/dt = −vi(t) + √(2D) ξi(t).   (6)
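The stationary statistics assumed for Eq (6) can be checked numerically. The sketch below (our illustration; D, dt and the run length are arbitrary choices) integrates the Ornstein-Uhlenbeck process with its exact one-step update, whose stationary density is the zero-mean Gaussian of variance D used below:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

# Ornstein-Uhlenbeck fluctuations of Eq (6) in rescaled time:
# dv = -v dt + sqrt(2D) dW. The exact one-step update keeps the
# stationary density Gaussian with mean 0 and variance D.
D, dt, n = 0.05, 0.01, 200_000
a = math.exp(-dt)                          # one-step decay factor
sigma = math.sqrt(D * (1.0 - a * a))       # preserves stationary variance D
v = np.zeros(n)
for k in range(1, n):
    v[k] = a * v[k - 1] + sigma * rng.standard_normal()
```

The sample mean and variance of `v` converge to 0 and D, respectively, over long runs.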

Then, using Eq (4) above and taking the mean over the N neurons,

dū(t)/dt = −ū(t) + < Σj wij f[ū(t−τ) + vj(t−τ)] >N.   (7)

Then, as N → ∞ [27],

dū(t)/dt = −ū(t) + ḡ ∫ f[ū(t−τ) + v] ρ(v) dv,   (8)

where ρ is the probability density function of the solution of Eq (6), i.e. a zero-mean Gaussian distribution with variance var[v] = D. Moreover, ḡ = < Σj wij >NxN = g is the mean network connectivity. In Eq (8), we have used the fact that E[XY] = E[X]E[Y] for the expectation value E of a product of two statistically independent random variables X and Y. This applies in Eq (8) since vj and wij are statistically independent.

Whenever the response function gain β is very large, f[u] ≈ H[u], where H is the Heaviside step function with H[u] = 0 for all u < 0 and H[u] = 1 for u ≥ 0. Then the right hand side of Eq (8) reads

−ū(t) + ḡ ∫ H[ū(t−τ) + v] ρ(v) dv = −ū(t) + (ḡ/2) ( 1 + erf[ ū(t−τ) / √(2D) ] ).   (9)

Combining the previous results, the mean-field dynamics of our spiking network is well approximated by the scalar non-linear delay-differential equation

dū(t)/dt = −ū(t) + ḡ F[ū(t−τ)],   F[u] = (1/2) ( 1 + erf[ u / √(2D) ] ).   (10)

Eq (10) shows that pre-synaptic noise generically results in a linearization of the neurons’ response function [28,29], cf. Fig 2, bottom panels. This linearization shapes the input-output relationship of driven neurons, but it also alters the features of emergent activity patterns such as synchronous oscillations [30]. Fig 2 illustrates that increasing the external noise tunes the frequency of the network mean activity.
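The noise-induced smoothing described above can be made concrete: in the high-gain limit, convolving the Heaviside step with a zero-mean Gaussian of variance D yields the Gaussian cumulative distribution, whose slope at the origin shrinks as 1/√(2πD). A short sketch:

```python
import math

def smoothed_response(u, D):
    """High-gain response under Gaussian input fluctuations: the
    Heaviside step convolved with a zero-mean Gaussian of variance D
    is the Gaussian CDF, as in Eqs (9) and (10)."""
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0 * D)))

def gain_at_origin(D):
    """Effective gain: slope of the smoothed response at u = 0,
    equal to 1 / sqrt(2 pi D), which shrinks as noise grows."""
    return 1.0 / math.sqrt(2.0 * math.pi * D)

# more noise -> flatter, more linear response function (Fig 2, bottom)
assert gain_at_origin(0.1) < gain_at_origin(0.01) < gain_at_origin(0.001)
assert abs(smoothed_response(0.0, 0.01) - 0.5) < 1e-12
```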

Fig 2. As noise increases, global oscillations accelerate and become gradually more linear.

Pre-synaptic noise generically results in a linearization of the neurons response function, altering the network stability and further shaping the frequency of ongoing oscillations. The network mean activity is shown (top panel) with a close up view of a few cycles (middle panel) with the associated response function (bottom panel), for various levels of noise. A. D = 0.001. B. D = 0.01. C. D = 0.1. Other parameters are identical to parameters used in Fig 1.

https://doi.org/10.1371/journal.pone.0161488.g002

Stochastic Stability Analysis

To investigate the frequency tuning observed in Fig 2, we take a closer look at the solutions of Eq (10). Synchronous activity in our network emerges through a supercritical Hopf bifurcation commonly seen in recurrent delayed networks [31]. The single fixed point ūo of Eq (10) results from setting dū/dt = 0,

ūo = ḡ F[ūo] = (ḡ/2) ( 1 + erf[ ūo / √(2D) ] ),   (11)

which can be solved numerically for any noise intensity D. Fig 3A shows that the mean-field fixed point decreases with increasing noise level.
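The self-consistency condition of Eq (11) is monotone in ū for inhibitory coupling, so a simple bisection suffices. The sketch below (our illustration; tolerances and brackets are arbitrary) reproduces the trend of Fig 3A with the Fig 3 value g = −2:

```python
import math

def Phi(x):
    """Standard normal CDF: the noise-smoothed step response."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fixed_point(g, D, lo=-10.0, hi=0.0, iters=100):
    """Bisection on h(u) = u - g * Phi(u / sqrt(D)); for g < 0 the
    unique root of Eq (11) lies in [lo, hi]."""
    h = lambda u: u - g * Phi(u / math.sqrt(D))
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if h(lo) * h(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# with inhibitory mean coupling g = -2 (Fig 3 parameters), stronger
# noise lowers the equilibrium, as in Fig 3A
u_weak, u_strong = fixed_point(-2.0, 0.01), fixed_point(-2.0, 0.1)
assert u_strong < u_weak < 0.0
```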

Fig 3. Network stability and equilibrium are shaped by noise.

A. Fixed point of the system as per Eq (11) as a function of increasing noise intensity. Noise generically decreases the equilibrium, due to an increased recruitment of recurrent connections. B. Network susceptibility as a function of noise intensity. A gradual shift towards the critical susceptibility Rc occurs under the action of noise, causing the system to transition from slow non-linear oscillations to fast linear oscillations. C. System’s eigenvalues for moderate (D = 0.01) and strong (D = 0.1) noise levels. The eigenvalues gradually shift towards the left hand side of the complex plane. Critical eigenvalues (pairs of roots inside the black boxes) translate towards the imaginary axis (Re(λ) = 0), i.e. closer to the critical state. Other parameters are α = 100Hz, β = 2500/mV, g = −2mV/Hz, s = 4mV/Hz, τ = 25ms.

https://doi.org/10.1371/journal.pone.0161488.g003

We use the knowledge of the noise-dependent fixed point to better understand the network stability. Linearizing the mean-field Eq (10) about the fixed point from Eq (11) yields, for the deviations w(t) = ū(t) − ūo,

dw(t)/dt = −w(t) + R w(t−τ),   (12)

where R = ḡ F′[ūo] is the network susceptibility. The stability of the network equilibrium is determined by setting w(t) ∝ e^(λt), leading to the characteristic equation

λ + 1 − R e^(−λτ) = 0.   (13)

Solutions to Eq (13) define the spectrum of the linearized system in Eq (12). Collective oscillatory solutions form in the network if the susceptibility reaches a critical value Rc for λ = iωc satisfying the complex relationship

iωc + 1 = Rc e^(−iωc τ),   (14)

where ωc is the critical Hopf frequency, i.e. the frequency of network synchronous oscillations close to the instability.
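Splitting Eq (14) into real and imaginary parts gives 1 = Rc cos(ωc τ) and ωc = −Rc sin(ωc τ), so ωc solves ω + tan(ωτ) = 0 with τ the delay in rescaled time. A sketch under the Fig 3 parameters (the bisection bracket is our choice for the first inhibitory branch):

```python
import math

alpha, tau = 100.0, 0.025
tau_r = alpha * tau                   # delay in rescaled time units (2.5)

# Hopf condition of Eq (14): w_c solves w + tan(w * tau_r) = 0 on
# (pi/(2 tau_r), pi/tau_r) for inhibitory coupling.
h = lambda w: w + math.tan(w * tau_r)
lo, hi = math.pi / (2.0 * tau_r) + 1e-9, math.pi / tau_r - 1e-9
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if h(lo) * h(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
w_c = 0.5 * (lo + hi)
R_c = 1.0 / math.cos(w_c * tau_r)     # critical susceptibility (negative)
f_c = w_c * alpha / (2.0 * math.pi)   # critical frequency in Hz

assert R_c < 0.0
```

For τ = 25 ms and α = 100 Hz, this places the critical frequency in the mid-teens of Hz, above the 10 Hz baseline.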

As seen in Fig 3B, in the absence of noise, the susceptibility is well below the critical value, and the network exhibits strong non-linear oscillations. This implies that R < Rc for D = 0 and the system evolves in a nonlinear limit cycle oscillation beyond a supercritical Hopf bifurcation while the fixed point is asymptotically unstable. As such, in the weak noise limit, the system is set robustly in the synchronous state.

However, in the presence of noise, the linearization of the neurons’ response function translates into a gradual increase of the susceptibility towards Rc: global synchronous oscillations in the network not only accelerate under the action of noise, but also become more linear. As such, noise brings the system closer to the bifurcation threshold and hence moves the network towards a critical asynchronous state. The effect of noise on the characteristic eigenvalue spectrum is plotted in Fig 3C. As noise intensity increases, the eigenvalues undergo a gradual shift towards the left hand side of the complex plane, bringing pairs of eigenvalues closer to the imaginary axis and thus closer to the bifurcation threshold. This occurs because the system’s susceptibility approaches the critical susceptibility under the effect of noise. As such, afferent inputs engage the network and drive it towards the asynchronous state, and the transition trajectory in parameter space is characterized by a gradual increase in the network peak frequency.
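Individual roots of the characteristic Eq (13), like those plotted in Fig 3C, can be tracked numerically; a Newton-iteration sketch (our illustration, with arbitrary starting points) is:

```python
import cmath

def char_root(R, tau_r=2.5, guess=0.1 + 1.0j, iters=60):
    """Newton iteration on the characteristic function of Eq (13),
    F(lam) = lam + 1 - R * exp(-lam * tau_r); returns one complex
    eigenvalue of the linearized delayed dynamics (cf. Fig 3C)."""
    lam = guess
    for _ in range(iters):
        F = lam + 1.0 - R * cmath.exp(-lam * tau_r)
        dF = 1.0 + R * tau_r * cmath.exp(-lam * tau_r)
        lam -= F / dF
    return lam

# sanity check: construct R so that lam0 is an exact root, then
# recover it from a nearby starting point
lam0 = 0.05 + 0.95j
R0 = (lam0 + 1.0) * cmath.exp(lam0 * 2.5)
assert abs(char_root(R0, guess=lam0 + 0.05) - lam0) < 1e-9
```

Sweeping R between the noise-free and critical values traces the eigenvalue trajectories of Fig 3C.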

Stochastic Frequency Tuning

Finding explicit frequency relationships in the fully non-linear regime is challenging. Yet, to approximate the dependence of the network frequency on the noise driving the neurons, one may use the Galerkin method [32, 33]. Whenever f[u] ≈ H[u] holds for sufficiently large values of the gain β, this approach seeks a frequency ω minimizing the measure (15). Using a harmonic ansatz for ū(t) while expanding F to third order about the steady state ūo, one obtains for the first iteration (16).

For vanishing external input, J = 0, one can solve Eq (16) for ω > 0 to obtain a first-order approximation of the stochastic frequency (17).

Expanding to third order for weak noise, i.e. D ≈ 0, (18), leads to the frequency tuning curve (19), where Δ(D) is a noise-induced shift with Δ(D) > 0. Fig 4 shows the frequency with respect to the noise intensity. Together, the results above, while approximate, demonstrate that under the action of noise the network oscillation frequency gradually shifts from a non-linear baseline frequency ωo towards the critical frequency ωc. Eqs (18) and (19) thus imply that as noise increases in the system, the network transitions from slow non-linear rhythms to fast linear oscillations. This can also be understood by looking at the susceptibility, which gauges the relative influence of recurrent interactions in the network, plotted in Fig 3B. As the magnitude of the susceptibility decreases under the action of noise, the system accelerates and shifts from a deeply synchronous, recurrence-dominated state to an asynchronous, input-driven regime.

Fig 4. Frequency tuning curve.

Frequency of the network synchronous oscillations as a function of noise intensity. Noise causes the peak frequency of the network oscillations to shift from the baseline frequency ωo towards the critical frequency ωc. The peak frequency is plotted according to numerical simulations of the network dynamics (red dotted curve), the mean-field approximation (grey; as per Eq 10) and using the frequency tuning curve (black; as per Eq 17). Other parameters are taken from Fig 3.

https://doi.org/10.1371/journal.pone.0161488.g004

Mean Field Analysis for Periodic Stimulation

We have seen that noise shapes the stability and oscillatory features of non-linear networks by linearizing the neurons response function. However, this feature is not exclusive to noisy inputs. Indeed, numerous studies have shown that, in addition to changes in brain state, ongoing oscillatory activity can be modulated by noninvasive stimulation, something that is increasingly capitalized upon in basic research and clinical practice [34, 35, 36, 37, 38, 39]. Recent results have further shown that exogenous electric rhythmic stimulation (i.e. periodic forcing), in addition to resonance and entrainment, can also provoke non-linear acceleration of endogenous oscillations and shift the baseline frequency of driven neural systems [40], causing resonance curves and Arnold tongues to bend in stimulation parameter space ([40] cf Fig 4). To explore the mechanism behind this non-linear effect, we here revisit the analysis performed in the stochastic case and adapt it to the presence of periodic forcing.

In the presence of global periodic stimulation with frequency fs and amplitude I0, the network dynamics obeys

dui(t)/dt = α( −ui(t) + Σj wij Xj(t−τ) ) + I0 sin(2π fs t).   (20)

First, we consider the activity coarse-grained in time,

dui(t)/dt = α( −ui(t) + Σj wij f[uj(t−τ)] ) + I0 sin(2π fs t),   (21)

similar to the stochastic case. Then we assume two temporal scales in the evolution of the potentials, with uj = mj + vj: the slow mode mj and the fast mode vj. Inserting this relation into Eq (21), we obtain

dmj(t)/dt = α( −mj(t) + Σk wjk f[mk(t−τ) + vk(t−τ)] )   (22)

and

dvj(t)/dt = −α vj(t) + I0 sin(2π fs t).   (23)

Since the experimental observation reflects the average activity in the neural ensemble, we consider spatially homogeneous slow activity m̄(t) = < mj(t) >N. Then, averaging over the time interval of one short stimulus cycle, one can express the spatio-temporal mean dynamics in the adiabatic regime for fast stimuli as

dm̄(t)/dt = α( −m̄(t) + ḡ < f[m̄(t−τ) + v(t−τ)] >T,N ).   (24)

Here <·>T,N denotes a spatial average and a time average taken over a time interval T that is small compared to the dynamics of the network, i.e. 1/fs ≤ T ≪ 2π/ωo, with

< ui(t) >T = (1/T) ∫ from t to t+T of ui(t′) dt′   (25)

for the local activity ui(t). Again, this implies that the driving frequency fs is large compared to the system’s self-sustained oscillation frequency. By virtue of this time scale separation, it is reasonable to assume stationarity within the time interval of duration T. The solutions of Eq (23) for large frequencies fs and large times t obey

vi(t) → μ sin(2π fs t + φ),   (26)

i.e. the fluctuations about the spatial mean synchronize and converge to a single solution of amplitude μ with a fixed phase shift φ. Then

< f[m̄(t−τ) + v(t−τ)] >T = ∫ f[m̄(t−τ) + v] ρ(v) dv   (27)

with the probability density

ρ(v) = 1 / ( π √(μ² − v²) ),  |v| < μ,   (28)

with μ = I0/(2πfs). Here, we have used the time scale separation between the slow evolution of m̄ and the fast synchronous fluctuations. Eq (27) has the same form as Eq (8) in the case of stochastic stimulation. Again, similar to the stochastic case, the external stimulation leads to a convolution of the nonlinear response function f with the probability density function of the stimulus-induced fluctuations about the spatial mean. We note that, in contrast to the stochastic case, the notion of a probability density function in the present deterministic system may appear counter-intuitive. This interpretation is however reasonable, since ρ(v) is proportional to the residence time of the oscillation vi(t) at amplitude v [41], which motivates the interpretation as a probability density.

Now assuming f[u] ≈ H[u], the convolution in Eq (27) reads

∫ H[m̄ + v] ρ(v) dv = Fμ[m̄] = 1/2 + (1/π) arcsin( m̄ / μ )   (29)

for |m̄| ≤ μ, with Fμ[m̄] = 0 for m̄ < −μ and Fμ[m̄] = 1 for m̄ > μ. Combining the previous results, the mean-field dynamics in the presence of periodic forcing becomes

dm̄(t)/dt = α( −m̄(t) + ḡ Fμ[m̄(t−τ)] ).   (30)

Stability and Frequency Tuning

Taking a closer look at the slope of the new response function Fμ, we find

Fμ′[m̄] = 1 / ( π √(μ² − m̄²) )   (31)

for |m̄| < μ, and Fμ′[m̄] = 0 otherwise. Hence the nonlinear response function flattens and becomes increasingly linear with increasing μ, similar to the stochastic case for increased noise levels. Fig 5 (bottom panels) shows Fμ for three different stimulus amplitudes, i.e. different values of μ, confirming this analytical finding.

Fig 5. As periodic driving amplitude increases, global oscillations accelerate and become gradually more linear.

The network mean activity is shown (top panels) with a close up view of a few cycles (center panels) with the associated response function (bottom panels), for various input amplitudes. A. Io = 0.01. B. Io = 0.1. C. Io = 1.0. Other parameters are α = 100Hz, β = 2500/mV, g = −2mV/Hz, s = 4mV/Hz, τ = 25ms.

https://doi.org/10.1371/journal.pone.0161488.g005

To gain insight into the dynamics of m̄, we consider the fixed point m̄o and small deviations about it. Similar to Eq (12), the corresponding characteristic roots are defined by λ + 1 = Rμ e^(−λτ) with the susceptibility Rμ = ḡ Fμ′[m̄o]. Since |Rμ| decreases with increasing driving amplitude, fast external periodic driving, similar to the stochastic case, moves the system from a nonlinear regime far from the bifurcation threshold to a linear regime close to the stability threshold, cf. Fig 3C.

Using the Galerkin approach as in Eq (15), one obtains an expression for the oscillation frequency analogous to Eq (17) but for the periodic forcing case (32), valid for small periodic driving amplitudes I0 and with the fixed point m̄o for small μ. Then, after a few calculus steps, one finds (33) for small values of μ. Hence, linearising the response function by fast external stimulation increases the oscillation frequency of the system. Fig 5 shows numerical simulations of Eq (30) for three different driving amplitudes. The oscillation frequency increases with increasing driving amplitude, in accordance with the analytical result in Eq (33). This finding resembles the stochastic case for increasing noise levels.

To summarize the results above, periodic forcing does not only interact with the network dynamics through resonance or entrainment; it also shapes the network’s activity via non-linear effects. This implies that weak, high-frequency stimulation can mediate frequency transitions in a similar fashion as stochastic inputs.

Discussion

Previous experimental and theoretical work has shown that gamma-like oscillatory activity is highly dynamic: it changes according to stimulus intensity [42] and spatial features [43,44], and is strongly correlated with the phase of slower frequencies [45,46]. Such gamma oscillations have been shown to build on highly local circuits shaping the timing of interactions between excitatory pyramidal cells and inhibitory interneurons [11]. Slower frequencies, such as alpha activity, have also been shown to be variable. Shifts in the peak alpha frequency have been reliably reported during changes in attentional states [13, 18, 19], during sensorimotor task performance [15], and following intense physical exercise [16].

Alpha oscillations have been shown to engage more spatially extended connections [47], in which propagation delays play an important role [20], suggesting that the mechanisms shaping the peak alpha frequency differ from those underlying faster, more local rhythms such as gamma. Most of the research on synchronous neural dynamics has been devoted to the study of input-induced transitions into and out of synchronous states, where network firing rate oscillations emerge in the presence of strongly correlated drive [30,48,49]. In contrast, we here characterize a smooth approach towards a delay-induced bifurcation in which the equilibria, stability and peak frequency are all impacted by noise or external periodic driving. Using mean-field approaches, we have detailed the stability of the network oscillatory states and derived a frequency tuning curve that relates the frequency of synchronous oscillations to the intensity of the input driving the neurons.

While we have considered constant values of the input intensity in our analysis, the results extend to time-varying inputs as well. For instance, the level of noise D(t) driving the neurons would vary according to particular sets of stimuli and/or tasks. In such a case, fluctuations in the noise intensity would be mirrored by concomitant changes in the network peak frequency. What are the implications of this dynamic behavior with respect to neural coding? Similarly, experimental setups involving periodic stimulation, such as deep brain stimulation [50,51,52], electric stimulation [53,54] or visual stimulation [55], will induce a shift of frequency [40].

According to the framework detailed here, shifts in the magnitude of the inputs to the neurons translate into changes of the frequency of synchronous oscillations. The network mean output activity can thus be described by the heuristic relationship

ū(t) ≈ A sin( 2π (fo + Δ) t + ϕ ),   (34)

where fo is the baseline frequency, Δ is an input-dependent frequency shift, A is an amplitude parameter (which may depend on input parameters) and ϕ is a random phase. From this perspective, the frequency tuning mechanism can be seen as implementing a frequency modulation (FM) communication scheme coding for input intensity. This hypothesis has been considered on numerous occasions [56,30], and has lately gained momentum alongside the development of the “Communication Through Coherence” (CTC) hypothesis, in which neural assemblies engage in long-distance communication based on the coherence between oscillatory states [57]. Hence, we hypothesize that the stochastic frequency tuning explored here could implement a gating mechanism allowing the routing of information towards different distal networks, and could thus represent an aspect of large-scale neural coding.
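The heuristic of Eq (34) is easy to synthesize; the sketch below (our illustration, with arbitrary Δ values) uses zero crossings as a crude frequency readout:

```python
import numpy as np

def network_output(t, f_o, delta, A=1.0, phi=0.0):
    """Heuristic network mean output of Eq (34): a rhythm at the
    baseline frequency f_o shifted by an input-dependent amount delta."""
    return A * np.sin(2.0 * np.pi * (f_o + delta) * t + phi)

t = np.arange(0.0, 1.0, 1e-3)
low = network_output(t, f_o=10.0, delta=0.0)    # weak input
high = network_output(t, f_o=10.0, delta=4.0)   # strong input: faster rhythm

# crude frequency readout: count zero crossings over one second
crossings = lambda x: int(np.sum(np.diff(np.sign(x)) != 0))
assert crossings(high) > crossings(low)
```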

Author Contributions

  1. Conceptualization: AH AM JL.
  2. Data curation: AH AM JL.
  3. Formal analysis: AH AM JL.
  4. Funding acquisition: AH AM JL.
  5. Investigation: AH AM JL.
  6. Methodology: AH AM JL.
  7. Project administration: AH AM JL.
  8. Resources: AH AM JL.
  9. Software: AH AM JL.
  10. Supervision: AH AM JL.
  11. Validation: AH AM JL.
  12. Visualization: AH AM JL.
  13. Writing – original draft: AH AM JL.
  14. Writing – review & editing: AH AM JL.

References

  1. 1. Klimesch W (1999) EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis Brain Res Rev. 29(2–3):169–95. pmid:10209231
  2. 2. Pfurtscheller G, Lopes da Silva FH (1999) Event-related EEG/MEG synchronization and desynchronization: basic principles. Clin Neurophysiol 110(11):1842–57. pmid:10576479
  3. 3. Mierau A, Felsch M, Hulsdunker T, Mierau J, Bullermann P, Weiss B, Struder HK. (2016): The interrelation between sensorimotor abilities, cognitive performance and individual EEG alpha peak frequency in young children. In: Clinical neurophysiology: official journal of the International Federation of Clinical Neurophysiology 127 270–276.
  4. 4. Wang XJ, Buzsáki G (1996) Gamma oscillation by synaptic inhibition in a hippocampal interneuronal network model. J Neurosci 16(20):6402–6413. pmid:8815919
  5. 5. Engel AK, Singer W. (2001) Temporal binding and the neural correlates of sensory awareness. Trends Cogn Sci 5:16–25. pmid:11164732
  6. 6. Engel AK, Fries P, Singer W. (2001) Dynamic predictions: oscillations and synchrony in top-down processing. Nat Rev Neurosci 2:704–16. pmid:11584308
  7. 7. Steriade M, Timofeev I, Grenier F. (2001) Natural waking and sleep states: a view from inside neocortical neurons. J Neurophysiol. 85(5):1969–85. pmid:11353014
  8. 8. Klimesch W. (2012): alpha-band oscillations, attention, and controlled access to stored information. In: Trends in cognitive sciences 16 (12), S. 606–617. pmid:23141428
  9. 9. Whittington MA, Traub RD, Jefferys JG (1995) Synchronized oscillations in interneuron networks driven by metabotropic glutamate receptor activation. Nature 373:612–615. pmid:7854418
  10. 10. Ray S, Maunsell JH (2010) Differences in gamma frequencies across visual cortex restrict their possible use in computation. Neuron 67:885–96. pmid:20826318
  11. 11. Jadi MP, Sejnowski TJ (2014) Cortical oscillations arise from contextual interactions that regulate sparse coding. Proc Natl Acad Sci U S A. 111:6780–5. pmid:24742427
  12. 12. Chiang AK, Rennie CJ, Robinson PA, van Albada SJ, Kerr CC. (2011) Age trends and sex differences of alpha rhythms including split alpha peaks. Clin Neurophysiol.122(8):1505–17. pmid:21349761
  13. 13. Haegens S, Cousijn H, Wallis G, Harrison PJ, Nobre AC (2014) Inter- and intra-individual variability in alpha peak frequency. NeuroImage 92:46–55. pmid:24508648
  14. 14. Klimesch W, Schimke H, Pfurtscheller G (1993) Alpha frequency, cognitive load and memory performance. Brain topography 5, 241–251. pmid:8507550
  15. 15. Hülsdünker T, Mierau A, Strüder HK (2016) Higher Balance Task Demands are Associated with an Increase in Individual Alpha Peak Frequency. Front. Hum. Neurosci. 9: 00695.
  16. 16. Gutmann B, Mierau A, Hülsdünker T, Hildebrand C, Przyklenk A, Hollmann W, Strüder HK (2015) Effects of Physical Exercise on Individual Resting State EEG Alpha Peak Frequency. Neural plasticity 2015, 717312. pmid:25759762
  17. 17. Cecere R, Rees G, Romei V (2015) Individual differences in alpha frequency drive crossmodal illusory perception. Curr Biol 25:231–235. pmid:25544613
  18. 18. Samaha J, Postle BR (2015) The Speed of Alpha-Band Oscillations Predicts the Temporal Resolution of Visual Perception. Curr. Biol. 25: 1–6.
  19. 19. Samaha J, Bauer P, Cimaroli S, Postle BR (2015) Top-down control of the phase of alpha-band oscillations as a mechanism for temporal prediction. PNAS 112(27):8439–8444. pmid:26100913
  20. 20. Cabral J, Luckhoo H, Woolrich M, Joensson M, Mohseni H, Baker A, Kringelbach M, Deco G. (2012) Exploring mechanisms of spontaneous functional connectivity in MEG: How delayed network interactions lead to structured amplitude envelopes of band-pass filtered oscillations, NeuroImage 90: 423–435.
21. Chanes L, Quentin R, Tallon-Baudry C, Valero-Cabre A (2013) Causal frequency-specific contributions of frontal spatiotemporal patterns induced by non-invasive neurostimulation to human visual performance. J Neurosci 33:5000–5005. pmid:23486970
22. Henry MJ, Herrmann B, Obleser J (2014) Entrained neural oscillations in multiple frequency bands comodulate behavior. Proc Natl Acad Sci U S A 111(41):14935–14940. pmid:25267634
23. Capocelli RM, Ricciardi LM (1971) Diffusion approximation and first passage-time problem for a neuron model. Kybernetik 8:214. pmid:5090384
24. Lansky P (1984) On approximations of Stein's neuronal model. J Theor Biol 107:631.
25. Hutt A, Buhry L (2014) Study of GABAergic extra-synaptic tonic inhibition in single neurons and neural populations by traversing neural scales: application to propofol-induced anaesthesia. J Comput Neurosci 37(3):417–437. pmid:24976146
26. Ermentrout GB (1998) Neural networks as spatio-temporal pattern-forming systems. Rep Progr Phys 61:353–430.
27. Shiino M (1987) Dynamical behavior of stochastic systems of infinitely many coupled nonlinear oscillators exhibiting phase transitions of mean-field type: H theorem on asymptotic approach to equilibrium and critical slowing down of order-parameter fluctuations. Phys Rev A 36:2393–2412.
28. Milton J (1996) Dynamics of Small Neural Populations. American Mathematical Society, Providence, Rhode Island.
29. Sutherland C, Doiron B, Longtin A (2009) Feedback-induced gain control in stochastic spiking networks. Biol Cybern 100:475–489. pmid:19259695
30. Lefebvre J, Hutt A, Knebel JF, Whittingstall K, Murray MM (2015) Stimulus Statistics Shape Oscillations in Nonlinear Recurrent Neural Networks. J Neurosci 35(7):2895–2903. pmid:25698729
31. Lefebvre J, Longtin A, LeBlanc VG (2009) Dynamics of driven recurrent networks of ON and OFF cells. Phys Rev E 80:041912.
32. He JH (2005) Periodic solutions and bifurcations of delay-differential equations. Phys Lett A 347:228–230.
33. Liu HM (2005) Approximate period of nonlinear oscillators with discontinuities by modified Lindstedt-Poincaré method. Chaos Solitons Fractals 23:577–579.
34. Paulus W (2011) Transcranial electrical stimulation (tES – tDCS; tRNS, tACS) methods. Neuropsychol Rehabil 21:602–17. pmid:21819181
35. Romei V, Driver J, Schyns PG, Thut G (2011) Rhythmic TMS over Parietal Cortex Links Distinct Brain Frequencies to Global versus Local Visual Processing. Curr Biol 21(4):334–337. pmid:21315592
36. Dayan E, Censor N, Buch ER, Sandrini M, Cohen LG (2013) Noninvasive brain stimulation: from physiology to network dynamics and back. Nat Neurosci 16(7):838–44. pmid:23799477
37. Neuling T, Rach S, Herrmann CS (2013) Orchestrating neuronal networks: sustained after-effects of transcranial alternating current stimulation depend upon brain states. Front Hum Neurosci 7:161. pmid:23641206
38. Notbohm A, Kurths J, Herrmann CS (2016) Modification of Brain Oscillations via Rhythmic Light Stimulation Provides Evidence for Entrainment but Not for Superposition of Event-related Responses. Front Hum Neurosci (In Press).
39. Helfrich RF, Schneider TR, Rach S, Trautmann-Lengsfeld SA, Engel AK, Herrmann CS (2014) Entrainment of brain oscillations by transcranial alternating current stimulation. Curr Biol 24(3):333–339. pmid:24461998
40. Herrmann CS, Murray MM, Ionta S, Hutt A, Lefebvre J (2016) Shaping Intrinsic Neural Oscillations with Periodic Stimulation. J Neurosci 36(19):5328–5337. pmid:27170129
41. Baker GL (2006) Probability, pendulums, and pedagogy. Am J Phys 74(6):482–489.
42. Brunel N (2000) Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J Comput Neurosci 8:183–208. pmid:10809012
43. Jia X, Smith MA, Kohn A (2011) Stimulus Selectivity and Spatial Coherence of Gamma Components of the Local Field Potential. J Neurosci 31(25):9390–9403. pmid:21697389
44. Hermes D, Miller KJ, Wandell BA, Winawer J (2014) Stimulus Dependence of Gamma Oscillations in Human Visual Cortex. Cereb Cortex 25(9):2951–9. pmid:24855114
45. Lisman JE, Jensen O (2013) The Theta-Gamma Neural Code. Neuron 77(6):1002–1016.
46. Cohen MX (2014) Fluctuations in oscillation frequency control spike timing and coordinate neural networks. J Neurosci 34:8988–98. pmid:24990919
47. Hindriks R, van Putten MJ, Deco G (2014) Intra-cortical propagation of EEG alpha oscillations. NeuroImage 103:444–453. pmid:25168275
48. Doiron B, Lindner B, Longtin A, Maler L, Bastian J (2004) Oscillatory activity in electrosensory neurons increases with the spatial correlation of the stochastic input stimulus. Phys Rev Lett 93:048101.
  49. 49. Lindner B, Doiron B, Longtin A (2005) Theory of oscillatory firing induced by spatially correlated noise and delayed inhibitory feedback. Phys Rev E 72:061919.
  50. 50. Perlmutter JS, Mink J (2006) Deep Brain Stimulation. Annu. Rev. Neurosci 29:229–257. pmid:16776585
  51. 51. Hammond C, Bergman H, Brown P (2007) Pathological synchronization in Parkinson’s disease: networks, models and treatments. Trends in Neuroscience 30:357–364.
  52. 52. Little S, Brown P (2012) What brain signals are suitable for feedback control of deep brain stimulation in Parkinson’s disease? Ann. N.Y. Sci. 1265:9–24.
  53. 53. Ali MM, Sellers KK, Fröhlich F (2013) Transcranial alternating current stimulation modulates large-scale cortical network activity by network resonance. J Neurosci 33:11262–75. pmid:23825429
  54. 54. Fröhlich F (2015) Experiments and models of cortical oscillations as a target for non-invasive brain stimulation. Progress in Brain Research 222: 41–73. pmid:26541376
  55. 55. Herrmann CS (2001) Human EEG responses to 1–100 Hz flicker: resonance phenomena in visual cortex and their potential correlation to cognitive phenomena. Exp Brain Res 137: 346–353. pmid:11355381
  56. 56. Hoppensteadt FC, Izhikevich EM (1998) Thalamo-Cortical Interactions Modeled by Weakly Connected Oscillators: Could the Brain use FM Radio Principles? BioSystems 48: 85–94. pmid:9886635
  57. 57. Akam TE, Kullmann DM (2012) Efficient “Communication through Coherence” Requires Oscillations Structured to Minimize Interference between Signals. PLOS Computational Biology 8(11): e1002760. pmid:23144603