Abstract
Ultra-slow fluctuations are a hallmark of spontaneous cortical activity. We examine the hypothesis that these dynamics arise from recurrent neuronal networks operating near a phase-transition point, a state marked by “critical slowing down”. In such networks, a subtle shift toward criticality should selectively amplify slow fluctuations, providing a lever that can switch the cortex from quiet rest into self-generated behavior. Using a simple random recurrent network, we reproduce this amplification effect. The resulting spectra closely match intracranial electroencephalography from human visual cortex recorded during rest and during category-specific visual free recall. In particular, the model captures the experimentally observed enhancement of slow fluctuations during recall. These simulations reveal a parsimonious mechanism that explains spontaneous ultra-slow activity and enables rapid transitions between spontaneous states, suggesting that dynamic tuning toward criticality may be a general strategy by which cortical networks enter a generative mode.
Author summary
Our brains never stand completely still: even at rest, neural activity drifts in very slow waves that last seconds to minutes. Where do these sluggish rhythms come from, and how do they help us shift from quiet rest into free, creative thought? We explored these questions using a minimalist computational model. The model is a network of simple “neurons” whose connections can be strengthened or weakened by a single gain knob. When the gain is set just below a critical tipping point, the network exhibits “critical slowing down,” creating ultra-slow fluctuations like those measured in real brains. Nudging the gain only slightly higher selectively boosts the slowest waves without destabilizing the network. This two-step behavior reproduces human intracranial recordings: the power at slow frequencies rises markedly in visual cortex when people freely recall images compared with passive rest. Our results suggest that cortical circuits normally hover near criticality and can edge even closer on demand, using critical slowing to shift rapidly from baseline activity to a generative mode that supports recall and other spontaneous, free behavior.
Citation: Yellin D, Siegel N, Malach R, Shriki O (2025) Adaptive proximity to criticality underlies amplification of ultra-slow fluctuations during free recall. PLoS Comput Biol 21(10): e1013528. https://doi.org/10.1371/journal.pcbi.1013528
Editor: Timothée Proix, University of Geneva Faculty of Medicine: Universite de Geneve Faculte de Medecine, SWITZERLAND
Received: March 14, 2025; Accepted: September 16, 2025; Published: October 28, 2025
Copyright: © 2025 Yellin et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: This study is based entirely on computational modeling and does not include experimental data. All code required to reproduce the simulations, power-spectral analyses, and figures is openly available at: https://github.com/dovi-yellin/critical_slowing_down_in_free_recall The repository includes a fully reproducible workflow and step-by-step instructions for recreating the results presented in the paper.
Funding: The study was funded by Israel Science Foundation (ISF) Grant 794/22 to OS. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Ultra-slow spontaneous activity fluctuations, also known as “resting-state” activity [1], have been consistently observed across all studied cortical regions [2–4]. The power spectra of these fluctuations have been characterized as obeying a power-law profile [5–7] and they emerge in well-defined networks across the cortex, uncovering unique patterns of functional connectivity [3,8,9]. These spontaneous fluctuations have also been linked to ramping anticipatory signals, as well as other spontaneous behavioral manifestations, such as pupil diameter fluctuations [10] and eye-drifts [11]. We previously proposed [12,13], and recently demonstrated, a direct link between the spontaneous activity fluctuations and free and creative verbal behaviors [14,15]. However, it is still not clear how the characteristic power spectrum of the spontaneous fluctuations - dominated by ultra-slow frequencies - emerges out of the typically fast neuronal elements. Several non-recurrent mechanisms have been proposed, including protracted synaptic currents that may integrate inputs over slow timescales [16], and power-law spike-frequency adaptation introducing slow dynamics [17]. We and others have previously suggested that these slow dynamics may emerge from recurrent neuronal connections [12,18–22].
Near-critical dynamics as a potential mechanism. Here, we extend this notion and examine the hypothesis that the cortex operates in the vicinity of a critical transition point [23–26]. In such a system, the slow resting-state fluctuations can be explained in the context of a fundamental property of near-critical recurrent systems - the slowing down of stochastic fluctuations as these systems approach criticality. Termed “critical slowing down” [CSD; 27–29], the phenomenon describes the universal tendency of systems near a transition point in their dynamics to exhibit slower time scales and take longer to relax to a steady-state equilibrium following perturbations. Hence, the slowing down of spontaneous fluctuations near criticality may explain their unique power spectra as observed in the resting-state networks. Indeed, some prior modeling work has tentatively proposed such a link [30,31].
Recent experimental and theoretical evidence supports the idea that the healthy cortex operates near a critical point, exhibiting slow time scales and scale-free dynamics [31–37]. Scale-free spatiotemporal cascades of neuronal activity, termed neuronal avalanches, have been found in in-vitro and in-vivo animal studies as well as in large-scale human fMRI, EEG and MEG recordings [32,38–40]. These avalanches can be well described by the framework of critical branching processes [41] and they can coexist with oscillations [42]. Other studies indicate edge-of-chaos criticality, in which the system hovers not on the brink of explosive growth but rather in a regime of richly chaotic yet bounded dynamics [43–45]. Another type of critical transitions discussed in the literature is pattern formation criticality, which refers to a transition into self-organized spatial patterns of activity, such as stationary bumps or traveling waves [22,46–48].
Importantly, near-critical dynamics optimize information processing [26,49–51] and deviations from critical dynamics are associated with disorders in information processing [52–55]. We have previously demonstrated the phenomenon of CSD in a recurrent network model of early visual processing, where the effective time scales grow longer as the network approaches a critical point during its learning process [22]. For a recent review of critical brain dynamics, see [23,56].
A straightforward prediction of the near-critical dynamics is that bringing the system closer to criticality should result in biasing the power spectra of the spontaneous fluctuations towards even slower frequencies. Indeed, recent experimental findings derived from direct recordings of brain activity in patients are compatible with this notion. Thus, in a study exploring brain mechanisms of visual recall, Norman et al. [13] found that resting state fluctuations displayed an increase in baseline neuronal firing rate and power of the slow spontaneous fluctuations during free recall. This local baseline shift - i.e., increase in average firing rate - was predicted to bring the targeted neurons closer to the decision threshold and enhance the probability of free recall during such periods.
It is important to emphasize that this amplification of the fluctuation amplitude can serve a significant cognitive and functional role. Free behavior relies on the ability to rapidly and flexibly switch between active and rest conditions. For instance, musicians can easily switch into free improvisation mode from a state of rest. If the slow spontaneous fluctuations indeed underlie free behaviors, a mechanism must be present in the brain to regulate these fluctuations. In the study of Norman et al [13], exploring this idea in the domain of free visual imagery and recall, it was found that resting state fluctuations displayed an increase in baseline and power of the slow spontaneous fluctuations specifically when subjects switched from rest to an active recall state. Here, we propose that this local dynamical shift, which increases the probability of targeted neurons to cross the decision threshold, can be elegantly explained by the CSD phenomenon.
Specifically, we propose that the related amplification of the spontaneous fluctuations may be due to bringing the network closer to its critical point, thereby leading to CSD. This provides a mechanism for the efficient implementation of a state change from rest to active, yet spontaneous, behaviors. We further speculate that the proximity to criticality is controlled through neuromodulation, allowing dynamic regulation of cortical states.
Network model and tuning of proximity to criticality. To explore the suggested mechanism, we simulated a random recurrent network of rate-based units at varying proximities to criticality. These rate models can be rigorously derived from large-scale conductance-based networks that model cortical dynamics [57]. Our network architecture is adapted from the one proposed by Chaudhuri et al. [30] to simulate intra-cranial EEG (iEEG) dynamics. Proximity to criticality is controlled by tuning relevant biophysically meaningful parameters. The precise tuning is determined by analyzing the recurrent interaction matrix using considerations from random matrix theory, assuming a large enough network size, typically on the order of a few hundred neurons [58–60]. Near-criticality is required to reproduce the observed scale-free power spectrum of iEEG recordings [30]. Setting the model closer to its critical point may account for the unique dynamics observed during free recall [13].
To provide further insight, Fig 1 schematically illustrates CSD in a random recurrent neural network and in a one-dimensional non-linear system near its critical point. Throughout this work, we refer to “criticality” in the following dynamical-systems sense: in the subcritical domain, perturbations decay and spontaneous activity remains bounded, whereas in the super-critical regime even small fluctuations grow exponentially, driving the mean firing rate toward saturation. This mean-field, non-oscillatory bifurcation yields avalanche-like dynamics rather than the edge-of-chaos or pattern formation transitions discussed above. Panel 1a shows the expected activity in a random recurrent network at varying proximities to its critical point, as set by the network gain parameter, G. The critical point is defined when G = 1 (see Methods for further mathematical substantiation). For a transient stimulus, activity will dissipate when the network gain is set to G < 1 (blue arrow), diverge to the maximal neuronal rate (black dashed line) when G > 1 (green arrow), and tend to persist at enhanced slow fluctuation frequencies when G ≈ 1, namely slightly below the critical point (yellow arrow). For a continuous signal, the steady-state power spectrum indicates an increase in the amplitude of slow activity only for G ≈ 1.
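As a minimal illustration of these three regimes, the slowest mode of such a network can be caricatured as a leaky integrator with recurrent gain G, whose effective time constant τ/(1 − G) diverges as G approaches 1. This scalar reduction and its parameter values are our own illustrative choices, not the paper's full model:

```python
import numpy as np

def impulse_response(G, tau=0.02, dt=0.001, T=1.0):
    """Euler-integrate tau*dx/dt = -(1 - G)*x after a unit impulse.

    This scalar equation caricatures the slowest mode of a linearized
    recurrent network with recurrent gain G: its effective time constant
    is tau / (1 - G), which diverges as G approaches 1 from below.
    """
    x = np.empty(int(T / dt))
    x[0] = 1.0  # transient stimulus at t = 0
    for t in range(1, len(x)):
        x[t] = x[t - 1] - dt / tau * (1.0 - G) * x[t - 1]
    return x

sub = impulse_response(G=0.5)    # sub-critical: dissipates quickly
near = impulse_response(G=0.98)  # near-critical: persists (CSD)
sup = impulse_response(G=1.05)   # super-critical: grows until saturation
```

With τ = 20 ms, the effective time constants of the three runs are 40 ms, 1 s, and (unstable), mirroring the dissipating, persistent, and diverging traces of panel 1a.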
The figure provides an illustrative and simplifying explanation of the critical slowing down (CSD) phenomenon in the dynamics of a recurrent system. CSD takes place as the system crosses between its sub-critical domain (G < 1, here depicted in blue) and its super-critical domain (G > 1, here in green). In the near-critical state (G ≈ 1, here depicted in yellow), when the system’s gain (also termed ‘control’) parameter is increased to the vicinity of (but still below) the critical point, the system displays cardinal slowing down and amplification of its dynamics. Panel a delineates the expected activity in a random recurrent network at varying proximity to its critical point. For a transient stimulus, activity in the network will (a) dissipate in the sub-critical domain (G < 1, bottom sub-panel), (b) diverge to the maximal neuronal rate (black dashed line) in the super-critical domain (G > 1, top sub-panel), and (c) tend to become persistent as the gain approaches G ≈ 1 (middle sub-panel). For a continuous signal, the power spectrum will indicate a rise in the amplitude of slow-wave activity only in the domain of G ≈ 1. Panels b and c provide further illustration of the CSD phenomenon, exploring the dynamics of a simple one-dimensional non-linear system driven by the quadratic equation dx/dt = r + x². Panel b depicts the system and its solutions as the value of r, the control parameter, is modified between super-critical (r > 0, in green), sub-critical (r < 0, blue) and near-critical (r ≈ 0, yellow) states. Panel c shows how the system dynamics slow down as r approaches 0. The curve corresponding to r > 0 diverges exponentially. See a more detailed discussion in the text.
Panels 1b & 1c offer a didactic illustration of CSD in a simple one-dimensional system. The dynamics are characterized by a simple quadratic equation, dx/dt = r + x², where x is a time-dependent variable and r is the system’s control parameter. For r > 0 the system is super-critical: there are no fixed points and x always diverges to infinity. For r < 0 the system is sub-critical and two fixed points arise at x* = ±√(−r) (panel 1b; filled circles correspond to stable fixed points and hollow circles to unstable fixed points). Close to a given fixed point, the dynamics can be linearized and the deviation from the fixed point can be approximated by an exponential temporal profile, δx(t) ≈ δx(0)·e^(t/τ), where 1/τ = 2x* is the derivative of the dynamics at the fixed point (note that for stable fixed points the derivative is negative and the dynamics converge). The magnitude of the time constant, |τ| = 1/(2√(−r)), that controls the dynamics approaches infinity as the system approaches its critical point (where r = 0). Thus, the dynamics slow down as the system approaches the critical point, as illustrated in panel 1c. Furthermore, at the bifurcation, the linear term vanishes, and the dynamics become dominated by higher-order terms, leading to power-law scaling behavior. This transition to power-law dynamics is a hallmark of critical systems, reflecting the absence of a characteristic timescale and the emergence of scale-invariant fluctuations.
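Under this saddle-node picture, the divergence of the relaxation time can be checked numerically; the kick size, tolerance, and step size below are arbitrary illustrative choices:

```python
import numpy as np

def relaxation_time(r, dt=1e-4, kick=0.1, tol=1e-3):
    """Time for x' = r + x**2 (with r < 0) to return to within `tol`
    of its stable fixed point x* = -sqrt(-r) after a small kick."""
    x_star = -np.sqrt(-r)
    x = x_star + kick
    t = 0.0
    while abs(x - x_star) > tol:
        x += dt * (r + x * x)  # forward-Euler step of the quadratic flow
        t += dt
    return t

# Linearization predicts tau = 1/(2*sqrt(-r)), so relaxation becomes
# markedly slower as r moves from -1.0 toward the critical value 0.
t_far = relaxation_time(-1.0)
t_near = relaxation_time(-0.01)
```

With these settings, the r = −0.01 system takes roughly an order of magnitude longer to relax than the r = −1.0 system, in line with the 1/(2√(−r)) scaling.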
In summary, we propose that the unique dynamics observed by Norman et al. [13] may be accounted for by a closer proximity of the local cortical network to a critical point during free recall. This enhances the gain of the local network, making it more sensitive to noise and driving it closer to the decision bound. We propose a model of a randomly connected recurrent network subject to a white noise input for this putative mechanism. Our key hypothesis is that this amplification in the slow activity amplitude stems from the CSD phenomenon, and can be obtained by a relatively small shift towards criticality. It is likely that any model that manifests such critical phase transition behavior and power law spectra would be equivalently sufficient. Our findings support the hypothesized slowing down modulation near-criticality, showing good fit with the empirical data. They support the hypothesis that the phenomenon of ultra-slow, scale-free resting state fluctuations and their modulation during intrinsically motivated tasks reflect a basic and ubiquitous behavior common to all suitably configured recurrent systems.
Results
To model the iEEG signal, we simulated a sparse random recurrent network, realizing a firing rate model (Methods). The summed activity of a sample of units represented the corresponding iEEG [30]. The proximity to the critical transition between stable and unstable dynamics is quantified by the control parameter G, which is determined by the neural gain γ, the network sparseness p, and the mean synaptic strength w (Methods). This relationship is obtained using tools from random-matrix theory to evaluate the spectrum of eigenvalues of the matrix that governs the linearized network dynamics [Methods; 30,59]. At G = 1, the dominant eigenvalue crosses from negative to positive, leading to a transition into runaway (supercritical) activity. To satisfy the large-network assumption underlying this analysis, the model size was set to N = 240 units (Methods; Appendix B, Fig A in S1 File).
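The random-matrix argument can be sketched as follows. The exponential weight distribution and unit mean weight used here are illustrative assumptions, not the paper's actual weight statistics:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 240, 0.2  # network size and sparseness from the main text

# Sparse random connectivity: each synapse present with probability p;
# the exponential weights (mean wbar = 1) are an illustrative choice.
W = (rng.random((N, N)) < p) * rng.exponential(1.0, size=(N, N))
np.fill_diagonal(W, 0.0)

# The linearized dynamics tau*dx/dt = (-I + gamma*W) x lose stability
# when gamma * lambda_max(W) crosses 1, fixing the critical gain.
lam_max = np.max(np.linalg.eigvals(W).real)
gamma_c = 1.0 / lam_max

# For large N, the mean-driven outlier eigenvalue sits near N*p*wbar.
prediction = N * p * 1.0
```

With these numbers, the measured leading eigenvalue lands within a few percent of the N·p·w̄ prediction, and the critical gain is simply its reciprocal.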
Throughout the main simulations, the sparseness p was fixed at 0.2 and the mean synaptic strength w at 49.881 (following [30]), while the gain γ was varied to move the system toward or away from the critical point. The time constant (τ) in these simulations was set to 20 ms, which is close to well-known estimates for cortical neurons’ membrane time constant [61–63]. A sampling parameter, α, defined the fraction of network nodes summed from the model to produce the simulated iEEG, and background drive was provided by a white-noise input (Methods).
To study the model dynamics, we also used an additive white-noise current (Methods), motivated by the baseline shift in firing rate observed during free recall [13]. Tuning just two parameters independently, the neural gain, γ, and the baseline shift added to the white-noise input, enabled us to match the slow-frequency power spectrum and reproduce the observed shift from rest to free recall (see Methods and Fig 3).
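A minimal sketch of such a simulation, under assumed weight statistics and noise amplitude (the paper's fitted values are given in its Methods): a linear rate network whose connectivity is rescaled so the control parameter equals G exactly, with the summed activity standing in for the iEEG.

```python
import numpy as np

def simulate_lfp(G, N=240, p=0.2, tau=0.02, dt=0.002, T=40.0, seed=1):
    """Linear rate network tau*dx/dt = -x + J@x + noise, with J rescaled
    so that its leading eigenvalue (the control parameter) equals G; the
    summed activity stands in for the iEEG."""
    rng = np.random.default_rng(seed)
    W = (rng.random((N, N)) < p) * rng.exponential(1.0, size=(N, N))
    np.fill_diagonal(W, 0.0)
    J = G * W / np.max(np.linalg.eigvals(W).real)
    x = np.zeros(N)
    lfp = np.empty(int(T / dt))
    for t in range(len(lfp)):
        x += dt / tau * (-x + J @ x) + np.sqrt(dt) * rng.standard_normal(N)
        lfp[t] = x.sum()
    return lfp

def slow_power(sig, dt=0.002, fmax=0.5):
    """Mean periodogram power below fmax, after dropping the transient."""
    sig = sig[len(sig) // 4:]
    f = np.fft.rfftfreq(len(sig), dt)
    psd = np.abs(np.fft.rfft(sig - sig.mean())) ** 2
    return psd[(f > 0) & (f < fmax)].mean()

rest_like = simulate_lfp(G=0.80)
near_critical = simulate_lfp(G=0.99)  # slow-frequency power rises sharply
```

Moving G from 0.80 to 0.99 leaves the fast end of the spectrum largely intact while strongly amplifying the power below ~0.5 Hz, the CSD signature exploited throughout the paper.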
Empirical iEEG results during wakeful rest (adapted from [7]), presented here in panels a and b, are compared and contrasted with corresponding fitted simulation results from an artificial neural network model in panels d and e, respectively. Panel a presents the HFB signal from homologous left and right auditory cortices. Panel b presents the autocorrelation (black line) and the cross-correlation (orange line) of the HFB PSD obtained across the two auditory cortex hemispheres. Panel c presents simulation analysis results placing computational bounds on the expected values of the control parameter G as it grows from the sub-critical to the critical range. The upper part of the panel shows the simulation results for maximum (blue line) and average (green line) pair correlation, placing a bound on G (red interval, range 0.91–0.96) for the simulated correlations to match the empirical results in panel a. The lower part of panel c shows the log ratio of simulated auto- to cross-correlation mean power at low frequency (orange line), indicating the best match to empirical results at G* (red cross). Panel d demonstrates a corresponding signal from two highly correlated units in the artificial network at G*. Panel e depicts, for comparison, our simulation results of the activity fluctuations’ mean auto- and cross-correlation at G*. The location of the knee frequency in the autocorrelation spectra (at ~1 Hz) of the empirical and simulated data is further compared in Appendix B, Fig B in S1 File.
The analysis unfolds in two stages. First, the model is calibrated to the resting-state HFB spectra measured by [7], which establishes quantitative bounds on how close the network can lie to the critical point. Working within that window, we then apply the same model to the free-recall data of [13]. We demonstrate that a modest gain increase, well inside the limits set by the data of [7], reproduces the baseline amplification and ultra-slow power boost observed in target populations. This progression from rest to task links the two empirical datasets, highlighting how proximity to criticality can enable rapid, goal-directed reconfiguration of network dynamics.
To that end, we first examined whether our network model could reproduce the dynamical characteristics of the spontaneous resting-state activity fluctuations as recorded in the human cortex, and found that it did when set close to its critical point. This is demonstrated in Fig 2, which compares real iEEG recordings and simulated network dynamics. It has been previously shown [64–66] that the power of the LFP in the high-frequency broadband (HFB) provides a reliable index of the average firing rates of neurons within the recording site. Note that this HFB measure should be distinguished from the direct recordings of the LFP signal itself. Panel 2a shows high correlation levels (Pearson r = 0.56) in spontaneous firing rate fluctuations based on HFB power from two auditory cortex electrical contacts recorded across hemispheres during rest (obtained from [7]). The autocorrelation (black) and cross-correlation (orange) PSD (often referred to simply as PSD and cross-PSD) of the HFB obtained from multiple auditory cortex sites across the two hemispheres during rest in Nir et al. [7] are shown in panel 2b.
To investigate the influence of the proximity to criticality on the typical correlation levels between pairs of units in the network, we simulated multiple network realizations, gradually increasing the control parameter in each realization from the subcritical (G = 0.8) to the near-critical (G = 0.99) range. The upper part of panel 2c presents the simulation’s maximum and average pair correlation in blue and green, respectively. The results indicate that the network first reaches the correlation level observed by Nir et al. [7] at G = 0.91 (blue curve), and that this becomes the mean correlation level at G = 0.96 (green curve), setting a bound on the operating range of G (see red interval).
Further “fine-tuning” of the actual value of the control parameter was established by computing the ratio of the cross- and auto-correlation power at low frequencies as the system approaches criticality. Previous work demonstrated that spatial cross-correlations arise more markedly than temporal autocorrelations during CSD [29,67]. The lower part of panel 2c shows the simulation results of this computation, in the range G = 0.8 to 0.99, over multiple realizations of the model network. Our findings show that the autocorrelation is less influenced by the growth of G than the cross-correlation, and thus their exponent ratio changes rather steeply as a function of G (orange line), providing an approximation, G* (see red cross), for the log ratio observed in the empirical findings (0.84). Panel 2d demonstrates correlated activity fluctuations of two simulated units in the recurrent network (corresponding to Fig 2a), whereas panel 2e depicts the simulated auto- and cross-correlation PSD results for this estimated value G*. For later reference, and given the inherent difficulty in determining an exact value for G*, we define a typical resting-state range (0.91–0.96, the red interval in panel 2c) within which G* is expected to lie. Note that results for other values of τ were explored to optimize the fit of the “knee” bend in the PSD curve, as presented in Appendix B, Fig B in S1 File, showing that a τ value of 20 ms provides a better fit than alternative settings of 2 or 200 ms. Note also that in the very near vicinity of the critical point (G → 1) the slow mode takes hold of neuronal correlations in the network, driving the mean correlation toward 1.
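The growth of pairwise correlations toward the critical point can be sketched as follows (illustrative weight statistics and noise level; a smaller N than the main model, for speed):

```python
import numpy as np

def mean_pair_correlation(G, N=120, p=0.2, tau=0.02, dt=0.002, T=30.0, seed=2):
    """Mean pairwise correlation between unit activities in a linear
    rate network whose connectivity is rescaled so the control
    parameter equals G."""
    rng = np.random.default_rng(seed)
    W = (rng.random((N, N)) < p) * rng.exponential(1.0, size=(N, N))
    np.fill_diagonal(W, 0.0)
    J = G * W / np.max(np.linalg.eigvals(W).real)
    x = np.zeros(N)
    n = int(T / dt)
    traces = np.empty((n, N))
    for t in range(n):
        x += dt / tau * (-x + J @ x) + np.sqrt(dt) * rng.standard_normal(N)
        traces[t] = x
    C = np.corrcoef(traces[n // 5:].T)  # discard the initial transient
    return C[~np.eye(N, dtype=bool)].mean()

far = mean_pair_correlation(G=0.5)
near = mean_pair_correlation(G=0.99)  # shared slow mode couples the units
```

Far from criticality the mean pairwise correlation stays near zero; near G = 1 the shared slow mode comes to dominate each unit's variance, pulling the mean correlation sharply upward.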
Our simulation focused on the low-frequency range, where it faithfully reproduced the auto- to cross-correlation power ratio found in Nir et al. [7] (as demonstrated in panel 2c). We did not attempt to optimize the mid- to high-frequency fit (panel 2e relative to 2b). At present, the model captures autocorrelation power more accurately than cross-correlation power; refining this aspect remains for future work. The sole purpose of the simulation based on Nir et al. [7] was to bound the admissible values of the global gain parameter, G, establishing the range later used to model the rest-vs-recall shifts observed by Norman et al. [13].
Next, we examined whether the recurrent network simulation using the same network size (240 units) and a similar parameter setting (as presented in Fig 2) could also fit the baseline shift and amplitude amplification of the slow spontaneous fluctuations observed during targeted free recall behavior, as reported in [13]. In this regard, the parameters p, w and τ remained unchanged from the previous simulation.
Fig 3 compares the intra-cranially recorded HFB baseline shift amplification with the simulated neuronal activity. To that end, the few remaining parameters of the simulated network were again fitted for an optimal power spectrum match with the intracranial HFB signal PSD. As for the setting of the decay constant, in addition to the optimal knee-location fit obtained in Fig 2 (indicating τ = 20 ms), another approximation method was employed, a Lorentzian fit, following [30]. The result of this approximation is shown in Appendix B, Fig C in S1 File. Thus, we establish once again, now based on two distinct methods, that setting this parameter near the biophysically grounded estimate of the membrane time constant optimizes the fit of the knee frequency location, and here also of the PSD curve for mid-range frequencies (> 0.1 Hz and < 10 Hz). To adjust the curve slope at the higher frequencies (> 10 Hz), the sampling fraction was set to α = 0.01. The simulation was run for 1800 s, with the first 600 s omitted as a stabilization period (by means of signal stationarity testing) before analysis of the remaining 1200 s.
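The Lorentzian logic can be sketched on synthetic data: an Ornstein–Uhlenbeck surrogate (our stand-in, with arbitrary parameters) has PSD A/(f_k² + f²), so fitting the knee frequency f_k recovers its time constant.

```python
import numpy as np

# An OU-like relaxation process with time constant tau_eff has the
# Lorentzian PSD P(f) = A/(fk**2 + f**2), with knee fk = 1/(2*pi*tau_eff);
# fitting fk to a measured spectrum thus estimates the dominant timescale.
rng = np.random.default_rng(3)
tau_eff, dt, n = 0.2, 0.001, 400_000
noise = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = x[t - 1] * (1.0 - dt / tau_eff) + np.sqrt(dt) * noise[t]

f = np.fft.rfftfreq(n, dt)
psd = np.abs(np.fft.rfft(x)) ** 2 * (2.0 * dt / n)

# Average the raw periodogram into log-spaced bins, then grid-search the
# knee frequency in the log-power domain (the offset log A drops out of
# the variance of the residuals).
edges = np.logspace(np.log10(0.05), np.log10(50.0), 30)
fb = np.array([f[(f >= lo) & (f < hi)].mean() for lo, hi in zip(edges, edges[1:])])
pb = np.array([psd[(f >= lo) & (f < hi)].mean() for lo, hi in zip(edges, edges[1:])])
fks = np.logspace(-1.5, 1.5, 300)
errs = [np.var(np.log(pb) + np.log(fk ** 2 + fb ** 2)) for fk in fks]
tau_hat = 1.0 / (2.0 * np.pi * fks[int(np.argmin(errs))])  # ~ tau_eff
```

The recovered tau_hat lands near the true 0.2 s; applied to the network's PSD, the same fit yields the knee that anchors the τ = 20 ms setting.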
Panel 3a depicts an example of the experimentally derived spontaneous HFB fluctuations in two neuronal populations - one that was targeted to elicit free recall (red) and the other that was not targeted (green). The baseline shift of the fluctuations for the targeted area is readily visible. It should be noted that in [13], a generalized enhancement of slow fluctuations was found for the free-recall conditions; however, the study did not find clear evidence for a selective enhancement in specifically targeted vs non-targeted categories, likely due to the mixed category selectivity and low SNR of the clinical recording sites. By contrast, in the model simulation, we assume a fully selective neuronal group - and the simulation results should be taken as a prediction of such category-selective effects in future studies. For comparison, panel 3b shows the simulated activity fluctuations for a network when a uniform baseline (fixed-value) shift (see Methods) was added to the white-noise inputs (red) and without this added shift (green). Note that the gap between the two curves can easily be fine-tuned by setting the relative values of the baseline shift and the white-noise amplitude, without compromising other measurable properties. Importantly, a similar amplification in the baseline firing rate was observed also by increasing the network gain, γ, towards the critical point, implying a minor 1% change of the control parameter G.
Panel 3c provides a more comprehensive analysis by showing the mean PSD across all intra-cranially recorded sites in the targeted neuronal population (red PSD) vs. the mean PSD across all sites in the non-targeted population (blue PSD). To ease the comparison of these neuronal recordings to the simulations, we overlay the simulated results on top of the brain recordings in panel 3d. In this panel, the mean and standard deviation over multiple realizations (specifically, 8 arbitrary random-seed network connectivity initializations) of simulated target (yellow) vs. non-target (green) blocks are presented. Here, the fit in PSD was reached by adding a fixed-value shift of 12.5 pA to the white noise during the simulated targeted blocks (orange thick curve) but not during the non-targeted blocks (green thick curve). The control parameter was set within the resting-state range of G, as in Fig 2. As in [13], the increase in the power of the ultra-slow modulation in the simulated PSD amplitude during recall relative to rest blocks over the different realizations was statistically significant (for the range < 0.2 Hz; p < 0.05, FDR corrected, signed-rank test). Goodness-of-fit analysis indicated that the simulated network (mean PSD, for both rest and recall conditions) captures a relevant segment of the empirical spectra (in the range 0.05 - 10 Hz) rather well (log-domain R² = 0.95); after correcting for the autocorrelation between neighboring frequency bins (14 effective vs. 256 raw points; see Methods), the corresponding two-tailed t-test indicated statistical significance.
Importantly, a comparable amplification of the targeted ultra-slow frequency PSD was also achieved through direct modulation of the synaptic connectivity matrix. Notably, increasing the network gain, γ, by 1% within the vicinity of G* was sufficient to replicate the empirically observed enhancement of slow frequencies during free recall. In Appendix B, Fig D in S1 File, we show a distinct single-realization PSD result from these simulations, using the same color scheme mentioned above. Note the similarity in the modulations of the PSD and, specifically, the amplification in the ultra-slow-frequency domain of the PSD, evident in the targeted relative to the non-targeted simulated blocks.
In the following subsections, we systematically examine the phenomenon of CSD in our recurrent random network model, focusing on its fundamental properties rather than fitting specific data. Accordingly, some parameter settings are adjusted freely to enhance clarity in the illustrations.
To examine the impact of network connectivity on the PSD over a broader modeling range and in greater detail, we systematically varied the network and connectivity parameters over a wide range, starting from well below the dynamical critical point. This is depicted in Fig 4. In panels 4a-b, a simplified network setting with 100 units was used to explore the influence of the control parameter on the spectra, via gradual wide-range adjustments of γ, while keeping all other parameters fixed. The simulation lasted 800 s, with the first 200 s omitted from the analysis to allow signal stabilization; τ was again set to 20 ms, and α was set to 1.0.
Panel 4a shows the effect of setting G over its full range from 0.0 to 1.0, whereas panel 4b focuses on the narrow range near the critical point. Three key findings emerge from this analysis: first, a gradual increase can be observed in the amplitude of slow frequencies, which accelerates near the critical point of G = 1; second, the medium- to high-frequency range of the PSD remains unchanged even when all network connections are set to zero, effectively isolating the neurons from each other; and third, for G near G* (green line), the slow-frequency PSD slope is, interestingly, close to the scale-free exponent of negative 2 [5]. In panel 4c, the PSD results for the same simulation as in Fig 3 are shown, with the control parameter in the vicinity of G*, but with gradually increasing levels of baseline shift of the noisy input. Similar to the modulation of connectivity strength towards criticality, when the network is already near-critical, additive noise also leads to a slowdown in the ultra-slow frequencies. To investigate further, we compared the influence of additive noise on the slowing-down phenomenon in networks at varying proximities to the critical point. Fig E in Appendix B in S1 File shows that additive noise significantly increases the amplitude of low frequencies when the network is near the critical point but has practically no effect when the network is far below the critical point.
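One plausible mechanistic reading of this noise-driven amplification can be sketched under our own assumptions (mean-field coupling, a logistic rate function, and illustrative parameters, none of which are the paper's fitted values): a constant shift in the input current moves the operating point along the rate nonlinearity, which raises the effective gain toward criticality without changing any synaptic weight.

```python
import numpy as np

def summed_rate(b, g=3.8, N=100, tau=0.02, sigma=0.3, dt=0.001, T=20.0, seed=4):
    """Mean-coupled sigmoidal rate network tau*dr_i/dt = -r_i +
    phi(g*mean(r) + b) + noise. The constant input shift b moves the
    operating point u* along the sigmoid, setting the effective gain
    G_eff = g*phi'(u*) with no change to any synaptic weight."""
    rng = np.random.default_rng(seed)
    r = np.zeros(N)
    out = np.empty(int(T / dt))
    for t in range(len(out)):
        u = g * r.mean() + b                       # shared recurrent drive
        r += dt / tau * (-r + 1.0 / (1.0 + np.exp(-u)))
        r += sigma * np.sqrt(dt) * rng.standard_normal(N)  # noise on rates
        out[t] = r.sum()
    return out

def slow_power(sig, dt=0.001, fmax=1.0):
    sig = sig[len(sig) // 4:]  # drop the initial transient
    f = np.fft.rfftfreq(len(sig), dt)
    p = np.abs(np.fft.rfft(sig - sig.mean())) ** 2
    return p[(f > 0) & (f < fmax)].mean()

rest = summed_rate(b=-2.5)    # low operating point: G_eff ~ 0.38
recall = summed_rate(b=-1.9)  # operating point near u* = 0: G_eff ~ 0.95
```

The shifted condition shows both the elevated baseline and the amplified slow fluctuations, without touching the connectivity, consistent with the two routes to amplification (input shift vs. gain increase) compared in the text.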
The emergence of a scale-free power law in the fluctuations of simulated activity (even in individual isolated neurons, for mid-to-high frequencies) raises the concern that the network behavior merely reflects the dynamical structure of the injected input noise. To examine this possibility, we modified the injected noise profile by high-pass filtering and observed the effect on the network PSD for parameters near criticality. The simulation was 800 s long, with the first 200 s excluded for signal stabilization. The time constant τ was set to 20 ms, as before. The control parameter, G, was set in the vicinity of criticality and α was set to 1.0. Fig 5 displays the simulation results. We also used the same noise profiles to simulate isolated neurons (zero-connectivity network), with the resulting PSD shown in Appendix B, Fig F in S1 File.
iEEG HFB signal from free-recall target (red) and non-target (green) neuron populations (adapted from [13]) in panel a is compared with the simulated spontaneous activity in "recall"/target (with additive noise) vs non-target/"rest" blocks in panel b. The power spectra obtained from all intracranially recorded free-recall target sites (red) relative to the non-target population (blue) are shown in panel c. Simulated results are overlaid on the empirical PSD results in panel d, showing a near fit, in expectation, over multiple realizations. The mean and standard deviation are shown for the free-recall target (yellow) and non-target (green).
Panels a and b demonstrate the CSD of low-range frequencies in an arbitrary network setting as the control parameter G approaches the value of 1 (colors indicate different values of G). In panel a, the cases of zero connectivity (i.e., G = 0) and of G ≈ G* are included (blue and green bold, respectively). In the inset (panel b), the gradual verge-of-criticality slowing down is highlighted. Panel c shows a similar amplification of slow frequencies, here as a function of the amount of additive noise (see Methods), overlaid on PSD results from [13]. Importantly, the network is set in the vicinity of criticality (G ≈ G*). Note that the different scales used on the 'Power' axis in the different panels are aligned with the relevant references (panels a and b are scaled to match [30], whereas panel c follows the scaling used in [13]).
Panels a-b demonstrate the simulated near-critical network PSD when the injected noise is high-pass filtered (panel a) relative to the initial unfiltered white noise (panel b). Insets c-d present the noise spectra used for each of the main panels, respectively.
As can be seen, even drastic changes in the spectral power profile of the input noise had only a minor impact on the low-frequency power-law profile of the network PSD, leaving the mid-to-high-frequency profile nearly unchanged. This is evident in the striking discrepancy between the PSD profile of the input noise (Fig 5, insets c-d) and the PSD of the entire network (panels 5a-b). Note the bump at ~3 Hz, a consequence of setting the high-pass filter's cutoff at that frequency. Of particular interest is the insensitivity of the network even when all slow input frequencies were drastically quenched to a tenth of their original power (compare panel 5a with 5c). Similar behavior of the mid-to-high-frequency PSD is also preserved in the isolated-neuron network (Appendix B, Fig E in S1 File).
Finally, to analyze the influence of network size on the amplification of the spontaneous fluctuations’ slow frequencies as the network approaches criticality, we scanned the parameter space of both network connectivity and size. In the analysis, we iteratively generated network realizations of larger and larger scale (150–350 units), and increased the control parameter G (via γ) from subcritical to near-critical range, while collecting the slow (normalized by fast) frequency power values.
The results are shown as a two-dimensional heat map (Fig 6). The x and y axes in this map depict network size and control parameter settings, respectively, whereas the color per pixel denotes the degree of slow frequency amplification (more details in Methods). The heatmap clearly demonstrates that modulation of the control parameter towards criticality is the dominant factor driving the slow-frequency power amplification effect. A mild trend for larger amplification in slow frequency power is shown for network size > 200. This trend appears to become negligible as networks grow larger.
The heatmap illustrates that amplification of normalized low-frequency power is driven mainly by the network’s proximity to criticality, as quantified by the control parameter, G.
Discussion
Ultra-slow spontaneous fluctuations are one of the most widely studied phenomena of the human brain. Numerous studies have revealed that these fluctuations are organized along highly complex but informative functional connectivity patterns [3,68–70] suggested to be related to cognitive habits, traits and pathologies [71–73]. It is also proposed that they drive self-generated, voluntary, and creative behaviors [13–15,74].
Despite extensive research on the properties and dynamics of these slow fluctuations [1,3,7,75], the underlying neuronal mechanism remains unknown. The slow dynamics and intricate organization of these fluctuations may suggest a complex underlying mechanism, but we argue that the opposite is true. We hypothesize that critical slowing down (CSD), a fundamental principle in the dynamics of recurrent networks, underlies the behaviors of the ultra-slow spontaneous fluctuations during both rest and task conditions.
Using a simple random recurrent network near criticality and unstructured white noise inputs, we successfully simulated the CSD dynamics and reproduced the power spectrum of the spontaneous (resting state) activity fluctuations as reflected in the intracranially measured HFB. Furthermore, a small uniform increase in synaptic efficacy, or alternatively, a small DC shift in this noise input, near the critical transition point of the network dynamics, was sufficient to simulate the amplitude enhancement documented using intracranial (iEEG) recordings during targeted free recall in the visual cortex. The idea that changes in the efficacy of recurrent interactions can modulate local intrinsic timescales was also explored in the context of selective attention [76]. Specifically, that computational modeling work proposes that increased intrinsic timescales during selective attention are associated with a shift of network dynamics closer to a critical point.
Recently, modeling of stochastic noise coupled with slow synapses in a recurrent spiking network successfully simulated the slow buildup anticipating voluntary movements [77]. Wider-scale whole-brain modelling indicates that the richness of resting-state dynamics may arise from a delicate balance between neuromodulatory gain and stochastic drive. In a primate connectome model, small amounts of noise let the network roam a repertoire of metastable configurations beyond those accessible in a purely deterministic regime [78]. A later study demonstrated that the same gain-noise interplay breaks symmetry and carves out a low-dimensional manifold matching empirical co-activation patterns [79]. Our minimal network formalizes this mechanism by demonstrating analytically how tuning a single gain parameter near the critical point both elevates noise sensitivity and stretches time scales into the ultra-slow domain. Complementary studies reveal that nudging gain toward criticality [31] or allowing inhibitory plasticity to self-tune this balance [80] likewise promotes realistic, noise-driven spontaneous activity, underscoring the generality of the framework we provide.
It is important to clarify that our model was chosen not by virtue of its likely optimal fit to the experimental data - but rather because it was the most basic model of a recurrent network we could conceive of. The model was previously shown to account for the 1/f envelope of the power spectrum of the raw iEEG data [30]. Furthermore, a major advantage of the simplicity of this model is that it allows for explicit control of the proximity to the critical point. By controlling the network gain, we can move a sub-critical model to a model that is near the critical threshold and explore the changes in the dynamics. In this respect, we effectively employ a spectrum of models, namely various random recurrent networks, of different levels of connectivity and size that generate iEEG like activity, at varying proximity to the critical transition. The crucial point is that only under parameters that enable CSD, namely near criticality, were we able to replicate the experimental data, and specifically the amplification in ultra-slow fluctuations.
Based on these simulations, we propose that the primary driver of slow spontaneous fluctuations in neuronal activity is stochastic, likely, but not necessarily, balanced noise that critically accumulates through a highly specific, recurrent connectivity structure. The exact source of this noise is unknown and could originate from various processes. One possibility involves intrinsic neuronal mechanisms, such as thermal, synaptic, and ion channel fluctuations [81,82]. These processes could collectively contribute to the random membrane potential fluctuations consistently observed across all types of cortical neurons [1,82,83]. Another possibility is that the power spectrum of incoming spiking activity from subcortical sites exhibits characteristics of white noise. A recent study on activity in the subthalamic nucleus demonstrated that while the spectrum of synaptic activity follows a power law with an exponent around −2, the spectrum of the resulting spike rate is nearly flat [84]. It is important to clarify that our model equations capture the dynamics of synaptic activity, which is the primary contributor to iEEG signals [85].
In this work, we simulated two key features of spontaneous fluctuations: their power spectra on the one hand and modulation during an intrinsically motivated task, on the other. However, we did not address how complex functional connectivity patterns may emerge from such noise fluctuations. We have previously suggested that these patterns are shaped by synaptic structures and reflect prior learning [73,86]. This hypothesis predicts that a recurrent network with updatable connectivity based on suitable learning rules will spontaneously reproduce a repeatedly activated specific pattern when network neurons, having learned the pattern, are injected with white noise near criticality. While we focus on the power spectra and modulation during task-induced targeting of categorical boundaries, future simulations could investigate the emergence of such functional connectivity patterns.
Our simulation focused specifically on ultra-slow fluctuations and their targeted amplification. While other aspects of the [13] study are intriguing, they may require more sophisticated network models and additional analyses. The dynamics of the recurrent network when crossing the critical point, and its relationship with the concept of ‘ignition’ [12], is another fascinating area of study. However, these aspects are beyond the scope of our current work and are left for future exploration.
As noted above, our model does not attempt to capture the full extent of the task and the consequent behavior. Rather, here we focus on the slowing down and amplification of the overall activity fluctuations. Our assumption is that shifting the network closer to criticality or enhancing the external noise may transition the network into a generative mode, where stored activity patterns are more likely to cross a relevant threshold. The advantage of the model in our context, is that the proximity to criticality can be directly controlled, allowing for a systematic examination of its effects on the resulting power spectrum.
Earlier studies have demonstrated the emergence of a scale-free 1/f power-law spectrum using rate-model simulations near criticality [30,61]. However, they focused primarily on the tipping point of criticality (G = 1) of the network parameter setting. In contrast, we aimed to investigate the minimal conditions required for the scale-free PSD to arise, examining network connectivity weights far below the critical point, as well as the impact of the input noise profile on the network's neuronal activity.
Intracranial recordings of single neurons and the high frequency broadband (HFB) amplitudes of the LFP, which has been shown to index the average firing rate of cortical neurons [64,66], reveal the power spectra of the spontaneous fluctuations in the human cerebral cortex during rest (panels 2c and 3c). These spectra consist of two power-law exponents, with slower frequencies following a shallower exponent than higher frequencies. Remarkably, a similar profile emerges in a random recurrent network with fully stochastic connection strengths and input noise under the appropriate parameters (compare to panels 2d and 3d).
These results were obtained using minimal, biophysically oriented parameter settings: a physiologically plausible membrane time constant (~20 ms), sampling of only a small part of the network nodes (α), which is compatible with the localized nature of the empirical recording sites, and control of the system's slowing down by a single gain parameter (γ).
We first discuss the high-frequency slope of the HFB PSD. Analyzing the behavior of the network under zero connectivity (panel 4a) revealed that the resultant fluctuations in membrane potential approximated a power law beyond a certain "knee" frequency, even when the spectrum of the stochastic noise injected into each neuron was flat (white noise) or biased towards higher frequencies (see Methods and Appendix B, Fig E in S1 File). Thus, each unit, when decoupled from the rest of the network, maintains the slow frequency band and attenuates the higher frequencies in a scale-free manner, following a Lorentzian power spectrum ([30] and Appendix B, Fig C in S1 File). A mathematical analysis presented in SI section 1 shows that a scale-free power spectrum with an exponent of −2 is indeed the expected result for the fast-mode HFB PSD. Recent studies of activity fluctuations in isolated cortical and hippocampal neurons also reveal that individual neurons are capable of generating stochastic fluctuations obeying scale-free power-law profiles as well as long timescales, even in the absence of network connections [87–89].
Examining the impact of network connectivity on the spontaneous fluctuations reveals a specific amplification of the amplitude of slow frequencies in the spontaneous fluctuation spectra as the network approaches its dynamical phase-transition point. This is shown when separately examining the impact of changing the connectivity strength, i.e., the efficacy of inter-neuronal communication weights, and the richness of connectivity as reflected in network size (Fig 6). A successful amplification of slow frequencies can only be achieved within a regime in which these two parameter spaces overlap. However, only random network configurations were explored in this study, and a better structured connectome arrangement, whether shaped by innate prearrangement or learned from prior experiences, may present other near-critical behaviors. Nevertheless, it should be noted that even in extremely small, properly configured networks possessing fewer than 20 neurons, a powerful amplification of ultra-slow fluctuations can be obtained, suggesting that ultra-slow fluctuations lasting many seconds do not necessarily depend on large-scale cortical networks for their generation, as also suggested by previous work [90].
Our simulation results are compatible with the notion that the spontaneous fluctuations can be selectively amplified when behavior switches into a free recall state. Furthermore, they predict that when participants attempt to visualize different items belonging to a certain category of visual stimuli, their slow spontaneous fluctuations should be selectively amplified, specifically in the corresponding neuronal representations. This can happen through a modulation towards criticality via connectivity gain or via additive external noise (see panels 3a and 3c and more details in [13]). Such amplification enhances the probability of these fluctuations crossing the decision bound, specifically in the category-selective area. This modulation toward a critical threshold thus provides an elegant mechanism for rapid and flexible control over the expression of any spontaneous behavior within an intentionally selected categorical boundary. For example, when we attempt to freely imagine different types of faces, we rarely, if ever, mistakenly picture a building. As we demonstrate here, modulation of proximity to criticality and the resulting targeted CSD offer a plausible mechanism for such precise cognitive distinctions. However, it should be emphasized that such specific, category-selective enhancement of slow frequencies has not yet been demonstrated experimentally, likely due to the low selectivity and poor SNR of the recordings available in the [13] study. Thus, our simulation of this result should be taken as a prediction for future studies with higher spatial resolution.
When local populations' activity fluctuations transiently cross the decision bound, this may lead to rapid self-amplification, termed "local ignition". Such ignitions have been experimentally observed during crossing of the perceptual awareness threshold using iEEG recordings [12,91]. Additionally, near a critical point, network sensitivity increases, enabling better encoding and recall of information through a balanced response to inputs and adaptation to noise [22,92].
Our simulations of a random recurrent network demonstrate that a simple shift in noise level or in gain modulation accurately reproduces the intra-cranially recorded slow frequency amplification observed in the empirical data during category-selective visual recall (panel 3d and Appendix B, Fig D in S1 File, respectively). The key factor inducing the amplification effect is the network’s proximity to its critical transition point, regardless of the specific mechanism controlling its excitation.
It is important to note that the empirical brain data and recurrent network simulations specifically relate to fluctuations in neuronal firing rate or, in cases where such measures are unavailable, to the broadband high-frequency (gamma) envelope of the LFP. Previous studies have shown that the gamma envelope reflects fluctuations in the average firing rates of local neuronal populations [64,66]. These measures of firing-rate fluctuations should be distinguished from fluctuations in the local field potential itself. Thus, it should be emphasized that the power spectrum of the LFP should not be confused with the power spectrum of the firing-rate or HFB fluctuations, as the two types of fluctuations can even be antagonistic to each other under different experimental paradigms [93].
Moreover, previous studies have suggested that the LFP power spectrum is optimally fitted by setting the dynamics control parameter precisely at the tipping point of criticality (i.e., at G = 1), which could potentially imply fundamental instability and a network easily crossing between subcritical and supercritical states [30,61]. However, our optimal fit for the HFB power spectra implies that the relevant cortical networks operate below this critical transition point (i.e., at G ≲ 1), where stability is still maintained but the system is close enough to critical dynamics to benefit computationally from near-criticality effects.
Temporally structured drive can push cortical circuits towards criticality. Cultured networks, for instance, shift toward avalanche-like dynamics when exposed to patterned inputs [94]. Recent whole-brain studies likewise report brief approaches to the critical boundary during executive control and creative ideation [95,96]. Our model captures these observations with a single control knob: a modest gain increase (G ≈ 0.94 → 0.96) that modulates the system from a stable, reverberating sub-critical regime towards the critical transition point, where noise-driven excursions enable rapid, metastable switching. This mechanism provides a plausible neural substrate for the explore–exploit cycle implicated in creative cognition. Importantly, the operating point remains sub-critical, consistent with the “edge-of-criticality” framework proposed by [97], while avoiding the sampling artefacts that may falsely suggest full transition into critical mode [98].
Wider implications of the CSD phenomenon should be noted. For example, it is well known that sleep deprivation can lead to over-excitability and even to epileptic seizures. Indeed, it has been proposed that sleep deprivation influences the proximity to criticality [99,100]. It is also known that sleep deprivation slows down reaction times and increases the presence of slow frequencies [28,99,101]. These phenomena may all be related. It may be that extensive recall tasks, which bring the system closer to criticality, will not only manifest in CSD but even trigger seizures in those who are highly susceptible.
A key hypothesis of our study is that cortical activity is modulated closer to a critical point in order to shift to a more generative mode, which is required under a wide range of conditions, such as creative and associative thinking [13,15,102]. The prediction is that behaviors requiring a generative mode would lead to slowing down of the dynamics compared to resting state. Note that this may potentially manifest in other signatures of criticality, such as measures related to edge-of-chaos dynamics and neuronal avalanches.
Finally, it is worth noting that the relationship between the LFP and firing rates is more appropriately captured by measuring the changes in the exponent of the LFP power-law PSD, as observed in our previous intra-cranial analysis of visual responses [103]. While these issues could potentially be examined in our recurrent network simulations, they have not been pursued in this work.
Overall, the findings from our simulations suggest that the ultra-slow spontaneous fluctuations are an emergent network phenomenon and support the idea that the cortex operates near a critical point, allowing for rapid and flexible control over spontaneous behaviors while maintaining stability. Furthermore, they provide a mechanistic explanation for the ability to flexibly target specific categories for free behavior. This work makes specific predictions and opens new avenues for investigating the role of criticality in brain function and could potentially lead to the development of new computational models for understanding the neural basis of cognition and behavior.
Methods
Random recurrent network as a rate model
Rate models can be derived from well-studied spiking network models [104]. They have been explored in the context of iEEG data in the recent literature [30,61]. In previous work [57], we have shown that such rate models, describing the neuronal firing rates, can be faithfully derived from large-scale conductance-based networks resembling cortical organization. The simplicity of rate models makes them highly useful in studying properties of large networks.
Here, we considered a random recurrent network model consisting of N rate-based units representing single neurons or groups of neurons (see Figs 2-6). We closely follow the model proposed in [30] and adapted its implementation by [105] (see: github.com/gyyang/KimYang2017_Cell) to explain the power spectrum of iEEG recordings. The rate dynamics of the j-th unit over time in this model is given by:

τ dr_j/dt = −r_j + [γ(Σ_k W_jk r_k + I_j)]_+

where r_j is the j-th unit activity level, and τ is a characteristic integration time. The [·]_+ brackets denote a rectified linear (ReLU) activation function. The variable γ denotes the slope of the firing rate–current curve (f-I curve). The weights W_jk describe the strength of the connection from the presynaptic unit k to the postsynaptic unit j, and I_j denotes the external input to the j-th unit (drawn here randomly from a standard normal distribution).
The dynamical variables, rj, can be interpreted as representing synaptic variables, which effectively low-pass filter the neuronal firing rates [57].
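A minimal Euler-integration sketch of these dynamics follows. Parameter values are taken from the Methods; the step size, run length, and per-step redrawing of the white-noise input are assumptions of this illustration, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameters follow the Methods; gamma is chosen so that G = gamma*mu_conn*p ~ 0.95
N, p = 300, 0.2
mu_conn, sigma_conn = 49.881, 4.988      # pA/Hz
tau = 0.020                              # 20 ms integration time constant
gamma = 0.95 / (mu_conn * p)             # f-I gain (Hz/pA)

# Sparse random weights: nonzero with probability p, normal entries, normalized by N
mask = rng.random((N, N)) < p
W = np.where(mask, rng.normal(mu_conn, sigma_conn, (N, N)), 0.0) / N

dt = 0.001                               # 1 ms Euler step (an assumption of this sketch)
steps = 5000                             # 5 s of simulated activity
r = np.zeros(N)
trace = np.empty(steps)
for i in range(steps):
    I = rng.standard_normal(N)                          # white-noise input per unit
    drive = gamma * (W @ r + I)                         # gain times total input current
    r = r + dt / tau * (-r + np.maximum(drive, 0.0))    # ReLU rectification
    trace[i] = r.mean()                                 # population-mean activity
```

With this construction the rates remain non-negative (the ReLU and the Euler step preserve r ≥ 0), and lowering or raising `gamma` moves the network farther from or closer to the critical point.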
Criticality in the random recurrent network
We consider a network architecture comprising a non-hierarchical cluster of recurrently connected neurons characterized by a semi-linear f-I curve with gain γ. Following [30], we choose the connections in this network (the entries of matrix W) to be sparse and random. More specifically, each entry is nonzero with probability p, and the nonzero entries are drawn from a normal distribution, with positive mean μconn and variance σconn, normalized by the size of the network, N.
When interpreted as a rate model (assumed to be reflected in the iEEG high-frequency broadband power modulation), we hypothesize that it depicts the summed recurrent synaptic current into a fraction of the neurons in this network, along with intrinsic membrane noise or some form of external noise.
Previous work [30] demonstrated how the product of three main parameters controls the proximity to the network's critical point. These are: the gain γ, the mean connection strength μconn, and the connection probability p. At the tipping point of criticality, they satisfy the relation γ·μconn·p = 1; see [30]. Thus, we define a control parameter G = γ·μconn·p, and require that it lies in the vicinity of the critical point, G ≲ 1. In a near-critical state, any one of these arguments composing the control parameter could be varied to yield similar results towards the crossing point into a supercritical state.
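With the fixed values reported in the Methods (μconn = 49.881 pA/Hz, p = 0.2), the control parameter G (the product of γ, μconn, and p) reduces to a simple function of the gain, so γ alone sets the distance to criticality. A trivial numeric illustration:

```python
# Control parameter G = gamma * mu_conn * p; criticality corresponds to G = 1
mu_conn, p = 49.881, 0.2              # fixed values from the Methods

def control_parameter(gamma):
    return gamma * mu_conn * p

gamma_crit = 1.0 / (mu_conn * p)      # gain at the tipping point (~0.1 Hz/pA)
subcritical_gamma = 0.95 * gamma_crit # places the network at G = 0.95
```

Varying μconn or p instead of γ changes G in exactly the same multiplicative way, which is why equivalent results can be obtained through any of the three parameters.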
To provide additional insight in the present context, we note that near a steady state, an algebraic linearized approximation of the spontaneous fluctuation dynamics can be obtained. We briefly summarize the details given in [30,58]:
- The eigenvalue spectrum of such random matrices is composed of a cluster of complex eigenvalues with negative real part, and a single isolated real eigenvalue, which corresponds to the uniform eigenvector (namely, to the mean network activity).
- The matrix that controls the linearized dynamics is a random matrix whose dominant eigenvalue is given by G. As the overall connection strength G approaches 1, the network's dominant feedback loop weakens its damping; activity therefore decays more and more slowly, until at G = 1 it no longer decays at all, and the system experiences a transition from stable to unstable dynamics.
Importantly, the underlying analysis relies on random matrix theory and consequently assumes that the network has a large scale [59,106], specifically, a minimum of a few hundred units (see Appendix B, Fig A in S1 File). This scale is fundamental because random matrix theory draws on statistical properties that accurately reflect the system’s behavior only when applied to large matrices. Adequate network size is thus vital for capturing the delicate dynamics and stability features essential for close analysis of proximity to criticality.
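This spectral structure is easy to verify numerically: for a sparse random matrix built as in the Methods, the bulk eigenvalues form a small cloud while the isolated outlier eigenvalue of the effective connectivity γW sits near G. A sketch (the 1/N weight normalization follows the Methods; the seed and network size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 400, 0.2
mu_conn, sigma_conn = 49.881, 4.988
gamma = 0.95 / (mu_conn * p)             # sets G = 0.95

# Sparse random connectivity: nonzero entries ~ N(mu, sigma), scaled by 1/N
mask = rng.random((N, N)) < p
W = np.where(mask, rng.normal(mu_conn, sigma_conn, (N, N)), 0.0) / N

eigs = np.linalg.eigvals(gamma * W)
dominant = eigs[np.argmax(eigs.real)]    # isolated real eigenvalue, close to G
```

For N of a few hundred, `dominant.real` lands near 0.95 up to finite-size fluctuations, while the remaining eigenvalues stay in a cloud of much smaller radius, illustrating why the analysis requires reasonably large matrices.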
Critical slowing down is manifested in the system’s temporal dynamics, as they are linked to the inverse of the abovementioned dominant eigenvalue (see SI for additional detail). As the system approaches criticality, signals reverberate in the network for increasingly longer time, producing a slowly decaying autocorrelation function and increasing the power of low frequencies.
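In this linearized picture, the slowest mode relaxes with an effective time constant of roughly τ/(1 − G), which diverges as G → 1. A back-of-the-envelope illustration of critical slowing down (the formula is the standard linearized approximation, not a quote from the source):

```python
tau = 0.020  # 20 ms membrane/integration time constant

def effective_timescale(G, tau=tau):
    # Relaxation time of the dominant mode: tau / (1 - G)
    return tau / (1.0 - G)

slow_99 = effective_timescale(0.99)   # 20 ms stretched into the seconds range
slow_90 = effective_timescale(0.90)
```

At G = 0.99 the 20 ms membrane time constant stretches to 2 s, moving the dynamics into the ultra-slow regime without any change to the underlying cellular timescale.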
For the sake of convenience, the following parameters were kept fixed across all simulation runs presented in the paper: μconn = 49.881 pA/Hz, σconn = 4.988 pA/Hz, p = 0.2.
Unless otherwise noted, the parameter used to control the proximity to criticality was the slope of the f-I curve, γ (for G = 1, we obtain γ = 1/(μconn·p) ≈ 0.1 Hz/pA). We did, however, verify that qualitatively equivalent results can be reached by modifying G using other parameters. The mean and std of I_j, the noisy input, were set to 0 and 1, respectively, consistent with the standard normal distribution noted above.
The characteristic integration time and the shape of the PSD profile
As for the characteristic integration time τ, we explored values in the range of 2–200 ms. Using a simple linearized mathematical approximation (see Appendix A; Appendix B, Fig C in S1 File and [30]), it can be shown that the commonly observed "knee" bend in the power spectral density (PSD) profile appears near f ≈ 1/(2πτ), and that the modulation of slow frequencies is also significantly influenced by the integration time constant, implying that the fit-tuning process should carefully consider its setting.
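For a single leaky unit driven by white noise, the linearized dynamics yield a Lorentzian spectrum with a −3 dB knee near f = 1/(2πτ), roughly 8 Hz for τ = 20 ms. A quick numeric check (a sketch with arbitrary power units):

```python
import numpy as np

tau = 0.020                          # 20 ms integration time constant

def lorentzian_psd(f, tau=tau):
    # PSD of a leaky integrator driven by white noise (arbitrary units)
    return 1.0 / (1.0 + (2.0 * np.pi * f * tau) ** 2)

f_knee = 1.0 / (2.0 * np.pi * tau)   # knee frequency, ~7.96 Hz for tau = 20 ms
half_power = lorentzian_psd(f_knee)  # power falls to exactly half at the knee
```

Beyond the knee, the Lorentzian falls off as 1/f², matching the −2 fast-mode exponent discussed in the Results.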
Our findings here indicate that using the previously reported timescale of the membrane time constant in the proximity of τ = 20 ms [61–63] led to an optimal fit regarding the location of this knee in the activity rate PSD curve as well as the related slow frequencies slope. That is, for both of the cases explored in [7,13], as demonstrated in Figs 2 and 3 respectively.
The sampling parameter α, defined as the fraction (i.e., 0 < α < 1) of network nodes selected for summation, plays a significant role in determining the slope of medium-range frequencies (> 0.1 Hz and < 10 Hz) in the PSD curve, since it influences the mixture of slow and fast dynamical modes exposed in the system. To best adjust the fit in the iEEG PSD mid-to-high frequencies (> 1 Hz), α was set within 0.01 ≤ α ≤ 0.1 [30]. When constructing the PSD, we sum the activities of αN random nodes and compute the spectrum of the summed activity. The larger the value of α, the larger the contribution of the dominant slow mode to the mid-range PSD slope, since this mode is shared among the network nodes.
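The sampling-and-summation step can be sketched as follows. The stand-in activity array and sampling rate are placeholders for illustration; a real run would use the rate-model output.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
N, alpha = 300, 0.05
fs = 1000.0

# Stand-in activity: N traces of 30 s (a real run would come from the rate model)
activity = rng.standard_normal((N, int(30 * fs)))

# Choose alpha*N nodes at random and sum their traces
idx = rng.choice(N, size=max(1, int(alpha * N)), replace=False)
summed = activity[idx].sum(axis=0)

# Welch spectrum of the summed activity
f, psd = welch(summed, fs=fs, nperseg=8192)
```

Increasing `alpha` raises the weight of the shared slow mode in `summed`, which is what flattens the mid-range slope of the resulting PSD.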
Simulation and parameter fitting efforts
Simulation code, including the full implementation of the rate model and the plotting of figures, was developed in Python, taking advantage of suitable open-source libraries (including, but not limited to, SciPy and Matplotlib), facilitating the tuning efforts for fitting the empirical results (details below).
To determine a stationary signal (and the initial period to be omitted before signal stabilization), the Augmented Dickey-Fuller test was iteratively applied, using the adfuller method of the Python statsmodels library, extending the omitted delay until the remaining segment reached stationarity at p < 0.01. For the results depicted in Fig 2, the signal was low-pass filtered (< 0.1 Hz) as in [7], using a Butterworth filter. Auto- and cross-correlation coefficients were computed using time-lagged correlation methods. In panel 2d, the activity of two arbitrarily selected, highly correlated neurons is presented. In panel 2e, the power spectrum was computed using Matplotlib mlab's PSD, implementing the Welch method.
A simple amplitude scaling factor k was applied to match the Welch-method results with those obtained by direct use of FFT power in [7]. A regression line was fitted using a curve-fitting method in the range of 0.1–10 Hz. Following [7], the autocorrelation was computed using signals from the first sampling only (resembling a single cluster of neurons in the network), and the cross-correlation was computed iteratively between unit pairs across both samples. The ratio of the power-law exponents of these cross- and auto-correlation functions was then computed at low frequencies. Finally, the power spectrum in panel 2e was obtained by applying the PSD method to the mean of these autocorrelation (black line) and cross-correlation (orange line) curves, which is then plotted in log scale.
For Fig 3, the simulation was arranged as consecutive "rest" vs "free-recall" 1800 s blocks. Additive noise was defined as a constant baseline value added to the noisy input within each simulation run. For the purpose of sampling a satisfactory distribution for statistical analysis, the simulation was repeated using different random seed values. The multiple realizations generated were independently compared with the empirical data. Otherwise, similar simulation and computational methods were applied as in Fig 2.
Testing for various possible confounds, the simulation configuration for the results presented in Figs 4 – 6 was suitably adapted, but remained similar in concept to the settings in Fig 3. In data preparation for Fig 5, white noise was sampled in advance, applying suitable filters (using scipy signal.butter methods) over the random (white noise) normal distribution based temporal signal. For the map presented in Fig 6, we iterated through the parameter space of network connectivity and size, iteratively generating random network realizations of larger and larger scale. Thus, as the control parameter G was modified (via γ) from subcritical to near-critical range, values of the slow (normalized by fast) frequency power modulation were collected.
Goodness-of-fit and degrees-of-freedom correction
To quantify how closely the model reproduced the empirical PSD we computed the LMSE goodness of fit as the coefficient of determination in log-power space, R² = 1 − Σ_f [log P_emp(f) − log P_model(f)]² / Σ_f [log P_emp(f) − ⟨log P_emp⟩]².
Adjacent frequency bins are not independent, because Welch segmentation and tapering introduce spectral leakage. We therefore replaced the nominal sample size N by an effective sample size N_eff = N / (1 + 2 Σ_{k≥1} ρ_k) [107], obtained from the residual series r_f = log P_emp(f) − log P_model(f), where ρ_k is the lag-k autocorrelation of r_f; the summation stops once ρ_k < 0. The corrected N_eff replaced N in the degrees of freedom of the significance tests, yielding conservative, assumption-compatible p-values that account for spectral redundancy.
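A minimal sketch of this effective-sample-size correction, assuming the standard N / (1 + 2 Σ ρ_k) form with the sum truncated at the first negative autocorrelation (the exact estimator of [107] has additional refinements):

```python
import numpy as np

def effective_sample_size(residuals):
    """N_eff = N / (1 + 2 * sum of positive-lag autocorrelations),
    truncating the sum at the first negative autocorrelation."""
    r = np.asarray(residuals, dtype=float)
    r = r - r.mean()
    n = len(r)
    denom = np.dot(r, r)
    correction = 0.0
    for k in range(1, n):
        rho_k = np.dot(r[:-k], r[k:]) / denom   # lag-k autocorrelation
        if rho_k < 0:
            break
        correction += rho_k
    return n / (1.0 + 2.0 * correction)

# Strongly autocorrelated AR(1) residuals: N_eff falls far below N.
rng = np.random.default_rng(2)
ar = np.zeros(2000)
for t in range(1, 2000):
    ar[t] = 0.9 * ar[t - 1] + rng.standard_normal()
print(effective_sample_size(ar))
```

For white-noise residuals the correction is negligible and N_eff stays close to N; for the φ = 0.9 AR(1) series above it shrinks by roughly an order of magnitude, which is what makes the corrected p-values conservative.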
The human iEEG data and its analysis
The empirical data used in this study have been previously reported in [7,13]. In both studies, intracranial recordings were obtained from patients with pharmacologically resistant epilepsy who were undergoing surgical evaluation for the treatment of their condition. In the first study [7], data were recorded from five participants (over multiple sessions) during wakeful rest. During wakeful rest, subjects' eyes were closed and the room was dark and quiet; alertness was monitored occasionally. The second study [13] included twelve participants who performed a free recall task, as follows. Immediately after a 200 s period of closed-eyes resting state, participants were presented with 14 different pictures of famous faces and popular landmarks in pseudorandom order (each picture was repeated four times and shown for 1500 ms, with a 750 ms inter-stimulus interval). After viewing the pictures, participants put on a blindfold (or closed their eyes) and performed a short interference task of counting back from 150 in steps of 5 for 40 s. Upon completion, recall instructions were presented, guiding the patients to recall items from only one category at a time, starting with faces in one run and with places in a second run. Participants were asked to verbally describe each picture they recalled, as soon as it came to mind, using 2–3 prominent visual features. The duration of this free recall phase was 2.5 min per category. In both experiments, analyses of the iEEG data focused on changes in high-frequency broadband amplitude (HFB, also known as high-gamma; defined as 40–100 Hz in the first study and 60–160 Hz in the second), shown to be a reliable marker of local neuronal population activity [66].
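A common way to extract an HFB amplitude trace is band-pass filtering followed by the Hilbert-envelope magnitude; the sketch below uses that generic pipeline on a toy signal and should not be read as the exact preprocessing of [7,13]:

```python
import numpy as np
from scipy import signal

def hfb_amplitude(x, fs, band=(60.0, 160.0)):
    """High-frequency broadband amplitude: band-pass filter in the HFB
    range, then take the magnitude of the analytic (Hilbert) signal."""
    sos = signal.butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = signal.sosfiltfilt(sos, x)
    return np.abs(signal.hilbert(filtered))

rng = np.random.default_rng(3)
fs = 1000.0
t = np.arange(int(10 * fs)) / fs
# Toy iEEG trace: a 100 Hz burst in the second half, riding on white noise.
x = rng.standard_normal(t.size)
x[t >= 5.0] += 3.0 * np.sin(2 * np.pi * 100.0 * t[t >= 5.0])

env = hfb_amplitude(x, fs)
print(env[t >= 5.0].mean(), env[t < 5.0].mean())
```

The envelope rises during the burst, illustrating why HFB amplitude tracks local population activation; the 60–160 Hz band corresponds to the definition used in the second study.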
Supporting information
S1 File.
Fig A. Relationship between network size and the effective control parameter, illustrating the variability in proximity to criticality for small networks. The real part of the largest eigenvalue of the linearized matrix representation determines the network gain. To achieve critical dynamics, the effective gain should equal 1. The figure shows the distribution of the effective gain as a function of network size when the parameters are set to yield a theoretical value of 1. Quantiles represent 100 random realizations of the network connectivity for each network size. The analysis clearly demonstrates how the variance diminishes with increasing network size. Note (in the inset) that when the network size exceeds 200 units, the effective gain is very close to 1. For small networks, the network can be effectively sub- or super-critical even when theoretically set to be near-critical. These results stem from the fact that the relevant analytical findings on random matrices hold only in the limit of very large networks. Fig B. Different settings of τ in fitting the PSD-curve knee frequency. Parameter fitting of the empirical iEEG HFB results of [7], using an artificial neural network model with 400 neurons, explored the optimal setting of τ. The autocorrelation (black line) and cross-correlation (orange line) PSD curves, from two highly correlated neurons in the artificial network, are presented in panels a–c for τ = 2, 20 and 200 ms, respectively. As indicated by the arrows pointing to the knee position within the empirical autocorrelation (black) and cross-correlation (orange) PSD curves, only panel b (τ = 20 ms) demonstrates an accurate match. Fig C. Lorentzian fitting of free-recall vs. rest HFB PSD curves. The power spectrum of the Lorentzian function is near-flat at low frequencies and shows 1/f² scaling at high frequencies, with a transition point set by the time constant of the exponential. The fitting process maximized the correlation between the respective Lorentzian and HFB PSD curves. The sub- and near-critical curve fits differ only in the setting of f_slow, which was slightly adapted. Fig D. Slow-fluctuation power increase in iEEG and in the recurrent network simulation. Single-realization simulation results from fitting the free-recall vs. rest PSD curves of Norman et al. [13] are demonstrated using additive noise and gain control. In both panels, the iEEG power spectra obtained from empirically recorded free-recall target sites (red), relative to the non-targeted population (blue), are compared with the respective simulated results. In the left panel, additive noise is used to modulate the slow-fluctuation power increase, whereas in the right panel the F-I curve gain parameter, γ, is modulated for this purpose. Note that panel 4c in the main text extends the fit shown here by demonstrating the influence of higher levels of additive noise on the amplification of slow-fluctuation power. Fig E. The influence of additive noise on ultra-slow-frequency power. Simulation results from our recurrent random network model far below the critical point (left panel) and in its near vicinity (right panel) indicate that the significant amplification of ultra-slow-frequency power with increasing additive noise depends on the proximity to criticality. In both panels, the x-axis indicates additive noise and the y-axis denotes the normalized slow-frequency power amplification. Results are based on 10 random realizations for each level of additive noise. Normalization is two-stage: first the ratio of slow over fast frequency power is obtained for each realization, and then the obtained value is divided by the zero-additive-noise mean. Statistical analysis indicates a significant effect only for the near-critical simulation results (R² = 0.862, p < 10⁻²⁰). Fig F. Impact of different input-noise spectral profiles on the PSD of a zero-connectivity network. Panels a–b show the PSD of a simulated zero-connectivity network when the injected noise is high-pass filtered (panel a) relative to the initial unfiltered white noise (panel b). The cutoff frequency of the high-pass filter was 2 Hz. As in Fig 5 of the main text, the scale-free spectral profile in the mid-to-high frequency range is qualitatively preserved.
https://doi.org/10.1371/journal.pcbi.1013528.s001
(DOCX)
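The Lorentzian form described in Fig C can be sketched as a least-squares fit in log-power space; the synthetic PSD, noise level, and fitting bounds below are illustrative assumptions (the paper fits empirical HFB PSD curves, maximizing correlation):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian_psd(f, amplitude, f_knee):
    """Lorentzian power spectrum: flat below the knee, 1/f^2 above it.
    The knee relates to the time constant via f_knee = 1 / (2*pi*tau)."""
    return amplitude / (1.0 + (f / f_knee) ** 2)

# Synthetic PSD with a knee at ~8 Hz (tau = 20 ms) plus multiplicative jitter.
rng = np.random.default_rng(4)
f = np.logspace(-1, 2, 200)                   # 0.1-100 Hz
true_knee = 1.0 / (2 * np.pi * 0.02)          # ~7.96 Hz
psd = lorentzian_psd(f, 10.0, true_knee) * np.exp(0.05 * rng.standard_normal(f.size))

# Fit in log space so low- and high-frequency ranges are weighted comparably.
popt, _ = curve_fit(
    lambda f, a, fk: np.log(lorentzian_psd(f, a, fk)),
    f, np.log(psd), p0=[1.0, 1.0],
    bounds=([1e-6, 1e-3], [1e6, 1e3]),
)
print(f"fitted knee ~ {popt[1]:.2f} Hz")
```

Fitting in log space mirrors how PSD curves are compared visually on log-log axes; the recovered knee frequency lands close to the 7.96 Hz ground truth.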
References
- 1. Raichle ME. The restless brain. Brain Connect. 2011;1(1):3–12. pmid:22432951
- 2. Allen EA, Damaraju E, Plis SM, Erhardt EB, Eichele T, Calhoun VD. Tracking whole-brain connectivity dynamics in the resting state. Cereb Cortex. 2014;24(3):663–76. pmid:23146964
- 3. Fox MD, Raichle ME. Spontaneous fluctuations in brain activity observed with functional magnetic resonance imaging. Nat Rev Neurosci. 2007;8(9):700–11. pmid:17704812
- 4. Smith SM, Nichols TE, Vidaurre D, Winkler AM, Behrens TEJ, Glasser MF, et al. A positive-negative mode of population covariation links brain connectivity, demographics and behavior. Nat Neurosci. 2015;18(11):1565–7. pmid:26414616
- 5. He BJ. Scale-free brain activity: past, present, and future. Trends Cogn Sci. 2014;18(9):480–7. pmid:24788139
- 6. Miller KJ, Sorensen LB, Ojemann JG, den Nijs M. Power-law scaling in the brain surface electric potential. PLoS Comput Biol. 2009;5(12):e1000609. pmid:20019800
- 7. Nir Y, Mukamel R, Dinstein I, Privman E, Harel M, Fisch L, et al. Interhemispheric correlations of slow spontaneous neuronal fluctuations revealed in human sensory cortex. Nat Neurosci. 2008;11(9):1100–8. pmid:19160509
- 8. Buzsáki G. Rhythms of the Brain. Oxford University Press. 2006.
- 9. Nir Y, Hasson U, Levy I, Yeshurun Y, Malach R. Widespread functional connectivity and fMRI fluctuations in human visual cortex in the absence of visual stimulation. Neuroimage. 2006;30(4):1313–24. pmid:16413791
- 10. Yellin D, Berkovich-Ohana A, Malach R. Coupling between pupil fluctuations and resting-state fMRI uncovers a slow build-up of antagonistic responses in the human cortex. Neuroimage. 2015;106:414–27. pmid:25463449
- 11. Ramot M, Wilf M, Goldberg H, Weiss T, Deouell LY, Malach R. Coupling between spontaneous (resting state) fMRI fluctuations and human oculo-motor activity. Neuroimage. 2011;58(1):213–25. pmid:21703354
- 12. Moutard C, Dehaene S, Malach R. Spontaneous Fluctuations and Non-linear Ignitions: Two Dynamic Faces of Cortical Recurrent Loops. Neuron. 2015;88(1):194–206. pmid:26447581
- 13. Norman Y, Yeagle EM, Harel M, Mehta AD, Malach R. Neuronal baseline shifts underlying boundary setting during free recall. Nat Commun. 2017;8(1):1301. pmid:29101322
- 14. Broday-Dvir R, Malach R. Resting-State Fluctuations Underlie Free and Creative Verbal Behaviors in the Human Brain. Cereb Cortex. 2021;31(1):213–32. pmid:32935840
- 15. Malach R. The neuronal basis of human creativity. Front Hum Neurosci. 2024;18:1367922. pmid:38476979
- 16. Harish O, Hansel D. Asynchronous Rate Chaos in Spiking Neuronal Circuits. PLoS Comput Biol. 2015;11(7):e1004266. pmid:26230679
- 17. Pozzorini C, Naud R, Mensi S, Gerstner W. Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci. 2013;16(7):942–8. pmid:23749146
- 18. Cramer B, Kreft M, Billaudelle S, Karasenko V, Leibfried A, Müller E, et al. Autocorrelations from emergent bistability in homeostatic spiking neural networks on neuromorphic hardware. Phys Rev Research. 2023;5(3).
- 19. Litwin-Kumar A, Doiron B. Slow dynamics and high variability in balanced cortical networks with clustered connections. Nat Neurosci. 2012;15(11):1498–505. pmid:23001062
- 20. Ostojic S. Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons. Nat Neurosci. 2014;17(4):594–600. pmid:24561997
- 21. Wilting J, Priesemann V. Inferring collective dynamical states from widely unobserved systems. Nat Commun. 2018;9(1):2325. pmid:29899335
- 22. Shriki O, Yellin D. Optimal Information Representation and Criticality in an Adaptive Sensory Recurrent Neuronal Network. PLoS Comput Biol. 2016;12(2):e1004698. pmid:26882372
- 23. Beggs JM. The cortex and the critical point: Understanding the power of emergence. The MIT Press. 2022.
- 24. Beggs JM, Plenz D. Neuronal avalanches in neocortical circuits. J Neurosci. 2003;23(35):11167–77. pmid:14657176
- 25. Kinouchi O, Copelli M. Optimal dynamical range of excitable networks at criticality. Nature Phys. 2006;2(5):348–51.
- 26. Shew WL, Yang H, Petermann T, Roy R, Plenz D. Neuronal avalanches imply maximum dynamic range in cortical networks at criticality. J Neurosci. 2009;29(49):15595–600. pmid:20007483
- 27. Lee K-E, Lopes MA, Mendes JFF, Goltsev AV. Critical phenomena and noise-induced phase transitions in neuronal networks. Phys Rev E Stat Nonlin Soft Matter Phys. 2014;89(1):012701. pmid:24580251
- 28. Meisel C, Klaus A, Kuehn C, Plenz D. Critical slowing down governs the transition to neuron spiking. PLoS Comput Biol. 2015;11(2):e1004097. pmid:25706912
- 29. Scheffer M, Bascompte J, Brock WA, Brovkin V, Carpenter SR, Dakos V, et al. Early-warning signals for critical transitions. Nature. 2009;461(7260):53–9. pmid:19727193
- 30. Chaudhuri R, He BJ, Wang X-J. Random Recurrent Networks Near Criticality Capture the Broadband Power Distribution of Human ECoG Dynamics. Cereb Cortex. 2018;28(10):3610–22. pmid:29040412
- 31. Deco G, Jirsa VK. Ongoing cortical activity at rest: criticality, multistability, and ghost attractors. J Neurosci. 2012;32(10):3366–75. pmid:22399758
- 32. Arviv O, Goldstein A, Shriki O. Near-Critical Dynamics in Stimulus-Evoked Activity of the Human Brain and Its Relation to Spontaneous Resting-State Activity. J Neurosci. 2015;35(41):13927–42. pmid:26468194
- 33. Haimovici A, Tagliazucchi E, Balenzuela P, Chialvo DR. Brain organization into resting state networks emerges at criticality on a model of the human connectome. Phys Rev Lett. 2013;110(17):178101. pmid:23679783
- 34. Linkenkaer-Hansen K, Nikouline VV, Palva JM, Ilmoniemi RJ. Long-range temporal correlations and scaling behavior in human brain oscillations. J Neurosci. 2001;21(4):1370–7. pmid:11160408
- 35. Shew WL, Plenz D. The functional benefits of criticality in the cortex. Neuroscientist. 2013;19(1):88–100. pmid:22627091
- 36. Shriki O, Alstott J, Carver F, Holroyd T, Henson RNA, Smith ML, et al. Neuronal avalanches in the resting MEG of the human brain. J Neurosci. 2013;33(16):7079–90. pmid:23595765
- 37. Yu S, Yang H, Shriki O, Plenz D. Universal organization of resting brain activity at the thermodynamic critical point. Front Syst Neurosci. 2013;7:42. pmid:23986660
- 38. Lombardi F, Shriki O, Herrmann HJ, de Arcangelis L. Long-range temporal correlations in the broadband resting state activity of the human brain revealed by neuronal avalanches. Neurocomputing. 2021;461:657–66.
- 39. Palva JM, Zhigalov A, Hirvonen J, Korhonen O, Linkenkaer-Hansen K, Palva S. Neuronal long-range temporal correlations and avalanche dynamics are correlated with behavioral scaling laws. Proc Natl Acad Sci U S A. 2013;110(9):3585–90. pmid:23401536
- 40. Tagliazucchi E, Balenzuela P, Fraiman D, Chialvo DR. Criticality in large-scale brain FMRI dynamics unveiled by a novel point process analysis. Front Physiol. 2012;3:15. pmid:22347863
- 41. Harris TE. The theory of branching processes. Springer-Verlag. 2012.
- 42. Lombardi F, Pepić S, Shriki O, Tkačik G, De Martino D. Statistical modeling of adaptive neural networks explains co-existence of avalanches and oscillations in resting human brain. Nat Comput Sci. 2023;3(3):254–63. pmid:38177880
- 43. Bertschinger N, Natschläger T. Real-time computation at the edge of chaos in recurrent neural networks. Neural Comput. 2004;16(7):1413–36. pmid:15165396
- 44. Legenstein R, Maass W. Edge of chaos and prediction of computational performance for neural circuit models. Neural Netw. 2007;20(3):323–34. pmid:17517489
- 45. Toker D, Pappas I, Lendner JD, Frohlich J, Mateos DM, Muthukumaraswamy S, et al. Consciousness is supported by near-critical slow cortical electrodynamics. Proc Natl Acad Sci U S A. 2022;119(7):e2024455119. pmid:35145021
- 46. Bressloff PC, Cowan JD. Spontaneous pattern formation in primary visual cortex. Nonlinear Dynamics and Chaos: Where Do We Go From Here?. IOP Publishing Ltd.
- 47. Chossat P, Faugeras O. Hyperbolic planforms in relation to visual edges and textures perception. PLoS Comput Biol. 2009;5(12):e1000625. pmid:20046839
- 48. Pinto DJ, Ermentrout GB. Spatially Structured Activity in Synaptically Coupled Neuronal Networks: II. Lateral Inhibition and Standing Pulses. SIAM J Appl Math. 2001;62(1):226–43.
- 49. Cramer B, Stöckel D, Kreft M, Wibral M, Schemmel J, Meier K, et al. Control of criticality and computation in spiking neuromorphic networks with plasticity. Nat Commun. 2020;11(1):2853. pmid:32503982
- 50. Liang J, Zhou C. Criticality enhances the multilevel reliability of stimulus responses in cortical neural networks. PLoS Comput Biol. 2022;18(1):e1009848. pmid:35100254
- 51. Safavi S, Chalk M, Logothetis NK, Levina A. Signatures of criticality in efficient coding networks. Proc Natl Acad Sci U S A. 2024;121(41):e2302730121. pmid:39352933
- 52. Arviv O, Medvedovsky M, Sheintuch L, Goldstein A, Shriki O. Deviations from Critical Dynamics in Interictal Epileptiform Activity. J Neurosci. 2016;36(48):12276–92. pmid:27903734
- 53. Dotan A, Shriki O. Tinnitus-like “hallucinations” elicited by sensory deprivation in an entropy maximization recurrent neural network. PLoS Comput Biol. 2021;17(12):e1008664. pmid:34879061
- 54. Fekete T, Omer DB, O’Hashi K, Grinvald A, van Leeuwen C, Shriki O. Critical dynamics, anesthesia and information integration: Lessons from multi-scale criticality analysis of voltage imaging data. Neuroimage. 2018;183:919–33. pmid:30120988
- 55. Fekete T, Hinrichs H, Sitt JD, Heinze H-J, Shriki O. Multiscale criticality measures as general-purpose gauges of proper brain function. Sci Rep. 2021;11(1):14441. pmid:34262121
- 56. O’Byrne J, Jerbi K. How critical is brain criticality?. Trends Neurosci. 2022;45(11):820–37. pmid:36096888
- 57. Shriki O, Hansel D, Sompolinsky H. Rate models for conductance-based cortical neuronal networks. Neural Comput. 2003;15(8):1809–41. pmid:14511514
- 58. Ganguli S, Bisley JW, Roitman JD, Shadlen MN, Goldberg ME, Miller KD. One-dimensional dynamics of attention and decision making in LIP. Neuron. 2008;58(1):15–25. pmid:18400159
- 59. Rajan K, Abbott LF. Eigenvalue spectra of random matrices for neural networks. Phys Rev Lett. 2006;97(18):188104. pmid:17155583
- 60. Vázquez-Rodríguez B, Avena-Koenigsberger A, Sporns O, Griffa A, Hagmann P, Larralde H. Stochastic resonance at criticality in a network model of the human cortex. Sci Rep. 2017;7(1):13020. pmid:29026142
- 61. Gao R, van den Brink RL, Pfeffer T, Voytek B. Neuronal timescales are functionally dynamic and shaped by cortical microarchitecture. Elife. 2020;9:e61277. pmid:33226336
- 62. Koch C, Rapp M, Segev I. A brief history of time (constants). Cereb Cortex. 1996;6(2):93–101. pmid:8670642
- 63. Softky WR, Koch C. The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs. J Neurosci. 1993;13(1):334–50. pmid:8423479
- 64. Manning JR, Jacobs J, Fried I, Kahana MJ. Broadband shifts in local field potential power spectra are correlated with single-neuron spiking in humans. J Neurosci. 2009;29(43):13613–20. pmid:19864573
- 65. Mukamel R, Gelbard H, Arieli A, Hasson U, Fried I, Malach R. Coupling between neuronal firing, field potentials, and FMRI in human auditory cortex. Science. 2005;309(5736):951–4. pmid:16081741
- 66. Nir Y, Fisch L, Mukamel R, Gelbard-Sagiv H, Arieli A, Fried I, et al. Coupling between neuronal firing rate, gamma LFP, and BOLD fMRI is related to interneuronal correlations. Curr Biol. 2007;17(15):1275–85. pmid:17686438
- 67. Dakos V, van Nes EH, Donangelo R, Fort H, Scheffer M. Spatial correlation as leading indicator of catastrophic shifts. Theor Ecol. 2009;3(3):163–74.
- 68. Fox MD, Corbetta M, Snyder AZ, Vincent JL, Raichle ME. Spontaneous neuronal activity distinguishes human dorsal and ventral attention systems. Proc Natl Acad Sci U S A. 2006;103(26):10046–51. pmid:16788060
- 69. Sadaghiani S, Kleinschmidt A. Functional interactions between intrinsic brain activity and behavior. Neuroimage. 2013;80:379–86. pmid:23643921
- 70. Strappini F, Wilf M, Karp O, Goldberg H, Harel M, Furman-Haran E, et al. Resting-State Activity in High-Order Visual Areas as a Window into Natural Human Brain Activations. Cereb Cortex. 2019;29(9):3618–35. pmid:30395164
- 71. Eldar E, Cohen JD, Niv Y. The effects of neural gain on attention and learning. Nat Neurosci. 2013;16(8):1146–53. pmid:23770566
- 72. Hahamy A, Wilf M, Rosin B, Behrmann M, Malach R. How do the blind “see”? The role of spontaneous brain activity in self-generated perception. Brain. 2021;144(1):340–53. pmid:33367630
- 73. Harmelech T, Malach R. Neurocognitive biases and the patterns of spontaneous correlations in the human cortex. Trends Cogn Sci. 2013;17(12):606–15. pmid:24182697
- 74. Norman Y, Malach R. What can iEEG inform us about mechanisms of spontaneous behavior?. Center for Open Science. 2022.
- 75. Raut RV, Snyder AZ, Mitra A, Yellin D, Fujii N, Malach R, et al. Global waves synchronize the brain’s functional systems with fluctuating arousal. Sci Adv. 2021;7(30):eabf2709. pmid:34290088
- 76. Zeraati R, Shi Y-L, Steinmetz NA, Gieselmann MA, Thiele A, Moore T, et al. Intrinsic timescales in the visual cortex change with selective attention and reflect spatial connectivity. Nat Commun. 2023;14(1):1858. pmid:37012299
- 77. Gavenas J, Rutishauser U, Schurger A, Maoz U. Slow ramping emerges from spontaneous fluctuations in spiking neural networks. Nat Commun. 2024;15(1):7285. pmid:39179554
- 78. Ghosh A, Rho Y, McIntosh AR, Kötter R, Jirsa VK. Noise during rest enables the exploration of the brain’s dynamic repertoire. PLoS Comput Biol. 2008;4(10):e1000196. pmid:18846206
- 79. Fousek J, Rabuffo G, Gudibanda K, Sheheitli H, Petkoski S, Jirsa V. Symmetry breaking organizes the brain’s resting state manifold. Sci Rep. 2024;14(1):31970. pmid:39738729
- 80. Hellyer PJ, Jachs B, Clopath C, Leech R. Local inhibitory plasticity tunes macroscopic brain dynamics and allows the emergence of functional brain networks. Neuroimage. 2016;124(Pt A):85–95. pmid:26348562
- 81. Rusakov DA, Savtchenko LP, Latham PE. Noisy Synaptic Conductance: Bug or a Feature?. Trends Neurosci. 2020;43(6):363–72. pmid:32459990
- 82. Yarom Y, Hounsgaard J. Voltage fluctuations in neurons: signal or noise?. Physiol Rev. 2011;91(3):917–29. pmid:21742791
- 83. Lampl I, Reichova I, Ferster D. Synchronous membrane potential fluctuations in neurons of the cat visual cortex. Neuron. 1999;22(2):361–74. pmid:10069341
- 84. Liu X, Guang J, Glowinsky S, Abadi H, Arkadir D, Linetsky E, et al. Subthalamic nucleus input-output dynamics are correlated with Parkinson’s burden and treatment efficacy. NPJ Parkinsons Dis. 2024;10(1):117. pmid:38879564
- 85. Einevoll GT, Kayser C, Logothetis NK, Panzeri S. Modelling and analysis of local field potentials for studying the function of cortical circuits. Nat Rev Neurosci. 2013;14(11):770–85. pmid:24135696
- 86. Harmelech T, Preminger S, Wertman E, Malach R. The day-after effect: long term, Hebbian-like restructuring of resting-state fMRI patterns induced by a single epoch of cortical activation. J Neurosci. 2013;33(22):9488–97. pmid:23719815
- 87. Gal A, Eytan D, Wallach A, Sandler M, Schiller J, Marom S. Dynamics of excitability over extended timescales in cultured cortical neurons. J Neurosci. 2010;30(48):16332–42. pmid:21123579
- 88. Gal A, Marom S. Entrainment of the intrinsic dynamics of single isolated neurons by natural-like input. J Neurosci. 2013;33(18):7912–8. pmid:23637182
- 89. Sukenik N, Vinogradov O, Weinreb E, Segal M, Levina A, Moses E. Neuronal circuits overcome imbalance in excitation and inhibition by adjusting connection numbers. Proc Natl Acad Sci U S A. 2021;118(12):e2018459118. pmid:33723048
- 90. Gal A, Marom S. Self-organized criticality in single-neuron excitability. Phys Rev E. 2013;88(6).
- 91. Fisch L, Privman E, Ramot M, Harel M, Nir Y, Kipervasser S, et al. Neural “ignition”: enhanced activation linked to perceptual awareness in human ventral stream visual cortex. Neuron. 2009;64(4):562–74. pmid:19945397
- 92. Kim M, Kim H, Huang Z, Mashour GA, Jordan D, Ilg R, et al. Criticality Creates a Functional Platform for Network Transitions Between Internal and External Processing Modes in the Human Brain. Front Syst Neurosci. 2021;15:657809. pmid:34899199
- 93. Privman E, Fisch L, Neufeld MY, Kramer U, Kipervasser S, Andelman F, et al. Antagonistic relationship between gamma power and visual evoked potentials revealed in human visual cortex. Cereb Cortex. 2012;21(3):616–24. https://doi.org/10.1093/cercor/bhq128
- 94. Habibollahi F, Kagan BJ, Burkitt AN, French C. Critical dynamics arise during structured information presentation within embodied in vitro neuronal networks. Nat Commun. 2023;14(1):5287. pmid:37648737
- 95. Chen Q, Kenett YN, Cui Z, Takeuchi H, Fink A, Benedek M, et al. Dynamic switching between brain networks predicts creative ability. Commun Biol. 2025;8(1):54. pmid:39809882
- 96. Müller EJ, Munn BR, Shine JM. The brain that controls itself. Current Opinion in Behavioral Sciences. 2025;63:101499.
- 97. Wilting J, Priesemann V. 25 years of criticality in neuroscience - established results, open controversies, novel concepts. Curr Opin Neurobiol. 2019;58:105–11. pmid:31546053
- 98. Neto JP, Spitzner FP, Priesemann V. Sampling effects and measurement overlap can bias the inference of neuronal avalanches. PLoS Comput Biol. 2022;18(11):e1010678. pmid:36445932
- 99. Harel A, Levkovsky A, Nakdimon IC, Gordon BC, Shriki O. EEG-Based Prediction of Reaction Time during Sleep Deprivation. Sleep. 2025.
- 100. Meisel C, Olbrich E, Shriki O, Achermann P. Fading signatures of critical brain dynamics during sustained wakefulness in humans. J Neurosci. 2013;33(44):17363–72. pmid:24174669
- 101. Nir Y, Andrillon T, Marmelshtein A, Suthana N, Cirelli C, Tononi G, et al. Selective neuronal lapses precede human cognitive lapses following sleep deprivation. Nat Med. 2017;23(12):1474–80. pmid:29106402
- 102. Pezzulo G, Zorzi M, Corbetta M. The secret life of predictive brains: what’s spontaneous activity for?. Trends Cogn Sci. 2021;25(9):730–43. pmid:34144895
- 103. Podvalny E, Noy N, Harel M, Bickel S, Chechik G, Schroeder CE, et al. A unifying principle underlying the extracellular field potential spectral responses in the human cortex. J Neurophysiol. 2015;114(1):505–19. https://doi.org/10.1152/jn.00943.2014
- 104. Wilson HR, Cowan JD. Excitatory and inhibitory interactions in localized populations of model neurons. Biophys J. 1972;12(1):1–24. pmid:4332108
- 105. Kim Y, Yang GR, Pradhan K, Venkataraju KU, Bota M, García Del Molino LC, et al. Brain-wide Maps Reveal Stereotyped Cell-Type-Based Cortical Architecture and Subcortical Sexual Dimorphism. Cell. 2017;171(2):456-469.e22. pmid:28985566
- 106. Tao T. Outliers in the spectrum of iid matrices with bounded rank perturbations. Probab Theory Relat Fields. 2011;155(1–2):231–63.
- 107. Afyouni S, Smith SM, Nichols TE. Effective degrees of freedom of the Pearson’s correlation coefficient under autocorrelation. NeuroImage. 2019;199:609–25.