
Firing rate homeostasis counteracts changes in stability of recurrent neural networks caused by synapse loss in Alzheimer’s disease

  • Claudia Bachmann ,

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Project administration, Visualization, Writing – original draft, Writing – review & editing

    c.bachmann@fz-juelich.de

    Affiliation Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany

  • Tom Tetzlaff,

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany

  • Renato Duarte,

    Roles Conceptualization, Writing – original draft, Writing – review & editing

    Affiliation Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany

  • Abigail Morrison

    Roles Conceptualization, Funding acquisition, Supervision, Writing – original draft, Writing – review & editing

    Affiliations Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA BRAIN Institute I, Jülich Research Centre, Jülich, Germany, Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr-University Bochum, Bochum, Germany

Abstract

The impairment of cognitive function in Alzheimer’s disease is clearly correlated to synapse loss. However, the mechanisms underlying this correlation are only poorly understood. Here, we investigate how the loss of excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons alters their dynamical characteristics. Beyond the effects on the activity statistics, we find that the loss of excitatory synapses on excitatory neurons reduces the network’s sensitivity to small perturbations. This decrease in sensitivity can be considered as an indication of a reduction of computational capacity. A full recovery of the network’s dynamical characteristics and sensitivity can be achieved by firing rate homeostasis, here implemented by an up-scaling of the remaining excitatory-excitatory synapses. Mean-field analysis reveals that the stability of the linearised network dynamics is, in good approximation, uniquely determined by the firing rate, and thereby explains why firing rate homeostasis preserves not only the firing rate but also the network’s sensitivity to small perturbations.

Author summary

Relating the properties of neuronal circuits with concrete functional roles pertaining to cognition and behavior is a complex endeavour. This is especially true when it comes to diseases and dysfunctions, where we are often left with high-level clinical observations (e.g. cognitive deficits and structural brain changes), without understanding their relationship. A potentially fruitful approach to address this problem consists of employing simplified mathematical models to test the relevant hypotheses, incorporating pathophysiological observations and evaluating their tentative functional consequences, in relation to clinical observations. In this work, we employ a spiking neural network model to study the effects of synaptic loss, as it is often observed in various neurodegenerative disorders, in particular Alzheimer’s disease (AD). We show that the loss of synapses drives the network into a less sensitive regime, which potentially accounts for the cognitive deficits of AD. We also endow the circuits with a compensation mechanism which restores the mean network activity by increasing the weight of the remaining connections. We demonstrate that this very simple compensatory mechanism can prevent the network from drifting into the less sensitive regime and recover all other dynamical features that are changed due to synapse loss. We further develop an analytical model that accounts for this surprising finding.

Introduction

Accelerated synapse loss is a prominent feature in many types of neurodegenerative disorders, such as Huntington’s disease, frontotemporal dementia or Alzheimer’s disease [1–5]. In Alzheimer’s disease (AD), synapse loss appears to be particularly important, as it is widespread across different brain areas and constitutes a key marker of the AD pathology (see, e.g., [5]). The mechanisms underlying AD-related synaptic modifications are currently the subject of intensive research, which has revealed that a number of different alterations at the molecular level may ultimately lead to synaptic decay [6–8], such as an abnormal occurrence of oligomeric and aggregated β-amyloid peptides (Aβ), an abnormal phosphorylation of the tau protein and the occurrence of neurofibrillary tangles, and disrupted signaling in neuroinflammatory and oxidative stress responses [8–11].

Previous studies have uncovered a strong positive correlation between cognitive impairment in AD patients and synapse loss [12–19]. In contrast, correlations between the cognitive status and the density of plaques or tangles have frequently been reported as rather weak. Synapse loss is therefore not merely a structural epiphenomenon of AD, but appears to be the physical correlate of cognitive decline. While the most commonly reported early symptom of AD is memory deterioration, the disease is associated with a wide range of other cognitive problems such as stereotyped, repetitive linguistic production, visuo-spatial deficits and disorientation, apraxia, and loss of executive functions, i.e. planning and abstract reasoning [20, 21]. The observed progression of cognitive symptoms goes hand in hand with brain tissue atrophy [22–24] associated with loss of synapses [25], suggesting that the synaptic degeneration may underlie the cognitive deterioration following the gradual involvement of different, functionally specialized brain regions.

It is known that AD-related molecular and cellular alterations, such as abnormal depositions of Aβ plaques or atrophy rates, often significantly precede cognitive symptoms (see, e.g., [26, 27], and references therein). However, mechanisms exist that counteract synapse loss [28, 29], at least in the early stages of the disease. Various studies have shown that the loss of synapses is accompanied by a growth of remaining synapses, such that the total synaptic contact area (TSCA) per unit volume of brain tissue is approximately preserved [12, 17, 18, 30]. It is likely that such compensatory mechanisms underlie the observed delay in the onset of cognitive symptoms with respect to the onset of symptoms at the cellular level [31]. The heterogeneity in the disease progression and the propensity to transition from healthy cognitive aging to mild cognitive impairment and dementia may thus be associated with a subject’s ability to counteract synapse loss and, to a certain extent, maintain global functionality in a way that masks the progressive underlying pathophysiology. Such homeostatic, regulatory mechanisms appear to play an important role in counteracting structural deterioration and preserving computational capabilities. On the other hand, they pose important challenges to the network’s functionality since they have the potential to disrupt the specificities of a circuit’s microconnectivity (namely the distribution of synaptic strengths) and thus degrade its information content (e.g. [32]). Successful homeostatic compensation thus requires a balanced orchestration which preserves the system’s computational properties and macroscopic dynamics, e.g., average firing rates [33, 34] and E/I balance [35], as well as the relative ratios and distributions of synaptic strengths (e.g. synaptic scaling mechanisms; [36, 37]).

Understanding the circuit-level consequences of synaptic alterations, entailing both the deregulation by synapse loss and the recovery through homeostasis, is essential for determining whether they represent a negative symptom of the disease or a compensatory response. One likely effect is a modification of the network’s firing rate. In order to maintain a physiological operating regime far from activity extremes (quiescence or epileptic activity), a network needs the capacity to regulate its firing rate. The degree to which this capacity may be impaired in AD is still under debate (see [10, 38]). While the effects of synaptic alterations on the network dynamics have been partially characterized, a direct link between synapse loss, network dynamics and functional decline has yet to be systematically established, with only a few studies addressing the topic [39–41]. However, this connection may prove fruitful, both for understanding the disease itself and for fostering the development of new diagnostic and therapeutic approaches. It is currently unknown to what extent homeostatic mechanisms, such as increasing the synaptic area [12, 18], can fully recover the network’s firing rate, or whether the preservation of the firing rate by such mechanisms also entails the preservation of cognitive performance.

In this study, we investigate the link between structure, dynamics and function using a recurrent spiking neural network model [42]. Despite their simplicity, such systems have been shown to support computations such as stimulus categorization, associative learning and memory, and information routing and propagation (see, e.g., [43–48]). Additionally, although these models have complex behavioral repertoires, they are often simple enough that their dynamics can be assessed analytically. The stability of the dynamics can then be related to measures of computational performance, such as the network’s sensitivity to perturbations and its classification capability [49, 50]. Thus, an analytical treatment of network dynamics can provide insight into why some realizations of such networks perform better than others and how performance is affected by structural changes. Theoretical studies explicitly addressing this issue have so far focused either on the disruption of oscillations or functional connectivity of the whole brain, or on memory only (especially memory retrieval; [39–41]).

Here, we investigate how the loss of excitatory-excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons (Sec. Computational network model of Alzheimer’s disease) and firing rate homeostasis, based on upscaling the remaining excitatory-excitatory connections, alter the dynamical characteristics of a network. Surprisingly, we find that firing rate homeostasis can restore a variety of dynamical features that are altered by synapse loss, including the increase in spike train regularity, the drop in the fluctuations of the population activity and the reduction of the total synaptic contact area (Sec. Total synaptic contact area and firing statistics). In addition, we observe that synapse loss decreases the network’s sensitivity to small perturbations (Sec. Perturbation sensitivity and linear stability), such that a network operating near the ‘edge of chaos’ would be shifted by synapse loss to a more stable regime; a shift which has been shown in previous studies to result in a decrease in computational capacity [49–54], and may account for the cognitive deficits observed in Alzheimer’s disease. Here, too, firing rate homeostasis counteracts the shift towards the stable regime. We further show that these compensatory mechanisms ultimately become exhausted if physiological limits are placed on the growth of synapses. As it is not obvious why simply maintaining the firing rate also maintains the stability of the network, we analyze the stability of the linearized network dynamics and discover a strictly monotonic relationship between the firing rate and the spectral radius of the network, which explains the restoration of the dynamics under the influence of firing rate homeostasis (Sec. Perturbation sensitivity and linear stability).

Results

Computational network model of Alzheimer’s disease

We study the effects of AD-related synaptic alterations on the network dynamics and computational characteristics in the framework of a generic mathematical neuronal network model (Fig 1A), which captures prominent structural and dynamical features of local neocortical networks such as the relative numbers of excitatory and inhibitory neurons [55, 56] and synapses [57, 58], sparse connectivity [56, 59], small synaptic weights [60], irregular [61–63] and predominantly asynchronous spiking [64], large membrane potential fluctuations [65–68], and a tight dynamical balance between excitatory and inhibitory synaptic currents [69]. The network is composed of randomly and sparsely connected populations of excitatory (E) and inhibitory (I) integrate-and-fire neurons, driven by external spiking input. The overall coupling strength is determined by the reference synaptic weight J. For simplicity, all excitatory connections (EE and IE) and all inhibitory connections (EI and II), respectively, have equal synaptic weight: JEE = JIE = J and JEI = JII = −gJ in the intact network (i.e. before synapse loss). The relative strength g of inhibitory weights is chosen such that the network is dominated by inhibition, to permit asynchronous irregular firing at low rates [70]. A complete specification of the network model and parameters can be found in Sec. Network model and S1 and S2 Tables in the Supplementary Material. An illustration of the connectivity of the excitatory population for an intact network and an example spike train is given in Fig 1B.
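
To make the connectivity scheme concrete, the following NumPy sketch constructs such a weight matrix. The population sizes, in-degrees and the values of J and g are illustrative placeholders and do not reproduce the exact parameters of S1 and S2 Tables; only the sign convention (JEE = JIE = J, JEI = JII = −gJ) and the sparse random in-degree structure follow the model description above.

    import numpy as np

    def build_connectivity(n_exc=800, n_inh=200, k_exc=100, k_inh=25,
                           J=1.0, g=6.0, seed=None):
        """Sparse random E/I weight matrix: every neuron receives k_exc excitatory
        inputs of weight +J and k_inh inhibitory inputs of weight -g*J.
        Parameter values are illustrative, not those of S1/S2 Tables;
        self-connections are ignored for brevity."""
        rng = np.random.default_rng(seed)
        n = n_exc + n_inh
        W = np.zeros((n, n))                 # W[i, j]: weight of synapse j -> i
        for i in range(n):
            exc = rng.choice(n_exc, size=k_exc, replace=False)
            inh = n_exc + rng.choice(n_inh, size=k_inh, replace=False)
            W[i, exc] = J                    # J_EE = J_IE = J
            W[i, inh] = -g * J               # J_EI = J_II = -g*J
        return W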

Fig 1. Sketch of the network model of Alzheimer’s disease and homeostasis.

A) The network comprises two reciprocally and recurrently connected populations of excitatory (E) and inhibitory (I) integrate-and-fire neurons, excited by an external spiking input. Arrow thickness indicates the relative strength of the connection. In this study, Alzheimer’s disease is modeled by removing connections between excitatory neurons (loss of EE synapses) and upscaling of the remaining EE synapses to maintain the average firing rate (firing rate homeostasis). B–E) Sketch of EE connection density (number of arrows in upper panels), connection strength (thickness of arrows in upper panels) and resulting single-neuron spiking activity (lower panels). B) Intact network (without synapse loss). C) Synapse loss without homeostasis: removal of EE synapses and resulting reduction in firing rate. D) Synapse loss with unlimited homeostasis: removal of EE synapses and increase in strength of remaining EE synapses to maintain the average firing rate. Synaptic weights are allowed to grow without bounds. E) Synapse loss with limited homeostasis: removal of EE synapses and bounded increase in strength of remaining EE synapses. Here, synaptic weights cannot exceed 120% of their reference weight. The firing rate is therefore only partially recovered. For a complete description and parameter specification of the network model, see Sec. Network model and S1 and S2 Tables in the Supplementary Material.

https://doi.org/10.1371/journal.pcbi.1007790.g001

We implement the effects of AD on the network connectivity by reducing the number of excitatory synapses on excitatory neurons (EE synapses; [7, 71]), whilst keeping the number of connections between other populations (EI, IE, II) constant. In the absence of any compensation mechanism, this modification leads to a reduction in the average firing rate (see Sec. Total synaptic contact area and firing statistics).

In biological neuronal networks, long-term activity levels are often stabilized by homeostatic regulation [20, 72, 73]. While a maintenance of firing rates has been observed at the level of individual neurons [33], long-term recordings suggest a predominance of a network-wide regulation [34] targeting a constant population firing rate. Such a homeostatic stabilization of the population firing rate can be accounted for by a global adjustment of synaptic weights (synaptic scaling; [74, 75]). Indeed, in the early stages of AD, synapse loss seems to be compensated by a growth of the remaining synapses [12, 18, 30]. To realize this mechanism in our spiking neuronal network, we implement a firing-rate homeostasis which compensates for the loss of EE synapses by a global increase in the weights JEE of the remaining EE synapses, thereby preserving the population firing rate. (In order to demonstrate that our results also apply to other forms of homeostasis, we also implement a local synaptic scaling mechanism, in which the firing rate is regulated at the level of individual neurons.)

For advanced AD, where a large portion of the EE synapses has been lost, a full recovery of the population firing rate through synaptic scaling would require unrealistically large synaptic weights. During aging and dementia, the maximum increase in synaptic size has been reported to be in the range from 9% to 24% (see [17], and references therein). We incorporate these findings by introducing an optional upper bound for the weight JEE of EE synapses.

To uncover the differential effects of excitatory synapse loss and homeostasis, in this study we investigate the dynamical and computational characteristics of a network for three different scenarios: synapse loss without homeostatic compensation (Fig 1C), synapse loss with an unlimited firing rate homeostasis where synaptic weights can grow without bounds (Fig 1D), and synapse loss with limited firing rate homeostasis where the synaptic weights cannot exceed 120% of the weight in the intact reference network (Fig 1E).
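
The three scenarios can be summarized by a simple procedure: delete a fraction of the EE synapses, then rescale the surviving EE weights by a common factor until the population firing rate is restored or an optional cap (here 120% of the reference weight) is reached. The sketch below illustrates this logic only; rate_fn is a hypothetical placeholder for whatever rate estimate is used (a short simulation or a mean-field solver), the step size is arbitrary, and in the study itself the scaling factor is determined offline as described in Sec. Effect of local homeostasis on perturbation sensitivity.

    import numpy as np

    def lesion_and_compensate(W, n_exc, loss_fraction, rate_fn, target_rate,
                              max_scale=1.2, step=0.01, seed=None):
        """Remove a fraction of the EE synapses in W, then upscale the remaining EE
        weights by a common factor until rate_fn(W) reaches target_rate (unlimited
        homeostasis: max_scale=np.inf) or the cap max_scale is hit (limited
        homeostasis).  rate_fn is a hypothetical stand-in for a rate measurement."""
        rng = np.random.default_rng(seed)
        W = W.copy()
        ee = W[:n_exc, :n_exc]                               # view on the EE block
        syn = np.flatnonzero(ee)                             # existing EE synapses
        drop = rng.choice(syn, size=int(loss_fraction * syn.size), replace=False)
        ee.flat[drop] = 0.0                                  # synapse loss
        lesioned = ee.copy()

        scale = 1.0
        while scale < max_scale and rate_fn(W) < target_rate:
            scale = min(scale + step, max_scale)             # grow remaining EE weights
            W[:n_exc, :n_exc] = lesioned * scale
        return W, scale                                      # scale = J_EE / J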

Note that the model’s high level of abstraction enables us to identify fundamental mechanisms, to reduce the risk of overfitting, and to arrive at general conclusions that may be transferred to other brain regions or even different spatial scales. Empirically observed features of biological neural networks such as heavy-tail synaptic weight distributions [76, 77] or active dendritic processing [78] are not explicitly incorporated. As a consequence, model parameters such as synaptic weights have to be regarded as “effective” parameters and cannot be mapped to biological parameters in a one-to-one fashion. Selecting a particular set of parameters to be considered “biologically realistic” would be misleading. Therefore, rather than focusing on a specific configuration of the model, we systematically vary both the reference synaptic weight J and the extent of synapse loss to uncover the general relationship between these parameters and the dynamical and computational properties of the network.

Fig 2 demonstrates the main effects of varying these parameters. The firing rate of a network increases monotonically with the choice of reference synaptic weight J; for a network with a given J, the firing rate of the network decreases with the loss of EE synapses, with the majority of the rate reduction occurring in the range 0–20% (Fig 2A). Likewise, for a given degree of synaptic loss, the firing rate increases with the strength of the remaining EE synapses (Fig 2B). Throughout this work, we investigate the behaviour of the network across the two-dimensional parameter space spanned by the reference synaptic weight J and the extent of synapse loss. The results are typically visualized with the help of contour plots; to facilitate their comprehension, their interpretation in the context of variations in firing rate is illustrated in Fig 2C. Note that a synapse loss of 95% is of course not biologically plausible; we chose generous ranges for both parameters to give a better visualization of the behaviour of the system and to demonstrate the surprisingly strong effects of homeostasis.

Fig 2. Effect of different parameter configurations on the network’s firing rate.

The time- and population-averaged firing rate is explored with respect to two parameters: the synaptic weight J and the loss of EE synapses. In A, the EPSP amplitude is fixed (J ∈ {0.1, 1, 2, 3}mV, represented by circle, square, triangle and diamond markers, respectively) and the firing rate ν is plotted against various degrees of synapse loss. In B, the firing rate ν is plotted against the synaptic weight J for four different degrees of EE synapse loss (0%, 10%, 30%, 70%; circle, square, triangle and diamond markers). The information in these two plots is combined in the contour plot C, which shows the dependence of the firing rate ν (colour coded) on the synaptic reference weight J and the degree of synapse loss. Vertical rose-colored lines correspond to the curves plotted in A, horizontal yellow lines to the curves plotted in B. For visualization purposes, markers show only a subset of the data. All plotted data correspond to the mean across 10 random realizations.

https://doi.org/10.1371/journal.pcbi.1007790.g002

Total synaptic contact area and firing statistics

In the absence of homeostatic compensation (left column of Fig 3), removal of excitatory synapses on excitatory neurons naturally results in a decrease in the population firing rate ν, irrespective of the synaptic-weight scale J (Figs 2 and 3A). An upscaling of the remaining EE synapses (middle column) allows us to preserve the population firing rate, even if substantial amounts of synapses are removed (vertical contours in Fig 3B). If the maximum synaptic weight is limited, firing rates are preserved only up to a critical level of synapse loss (early stages of AD; Fig 3C).

Fig 3. Effect of synapse loss and firing rate homeostasis on firing rate, synaptic weights and total synaptic contact area.

Dependence of the time- and population-averaged firing rate ν (A–C), synaptic weight JEE (D–F) and the relative total synaptic contact area (TSCA) of EE synapses (G–I) on the reference weight J and the degree of EE synapse loss in the absence of homeostatic compensation (left column), as well as with unlimited (middle column) and limited firing rate homeostasis (right column). Color-coded data represent mean across 10 random network realizations. Symbols mark parameter configurations shown in Fig 5.

https://doi.org/10.1371/journal.pcbi.1007790.g003

Experimental studies have shown that, in early AD, the reduction in the number of synapses is accompanied by a growth of the remaining synapses such that the total synaptic contact area (TSCA) per unit volume is approximately preserved [12, 18, 30]. Our simple AD network model reproduces this finding if we define the TSCA as the product of the number of EE connections and the synaptic weight JEE (Sec. Synaptic contact area and characterization of network activity). Without homeostatic upscaling of EE weights, the TSCA is proportional to the number of EE connections and therefore quickly decreases with increasing levels of synapse loss (Fig 3G). In the presence of firing rate homeostasis, however, the TSCA remains largely constant unless a majority of synapses is lost (Fig 3H) or the maximum synaptic weight is reached (Fig 3I). We conclude that the experimentally observed stabilization of the TSCA in the face of synapse loss may be a consequence of a homeostatic synaptic scaling regulated by the average population firing rate.
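
As a worked example, the EE weights reported in the Fig 5 examples (reference JEE = 1.4mV and, after 30% synapse loss, JEE = 2.02mV with unlimited and JEE = 1.68mV with limited homeostasis, KEE = 100 in the intact network) give the following relative TSCA values under this definition:

    def relative_tsca(k_ee, j_ee, k_ee_ref=100, j_ee_ref=1.4):
        """Relative total synaptic contact area of the EE projection: the product of
        the number of EE connections per neuron and the EE weight (in mV),
        normalized by the intact reference network (values taken from Figs 5 and 6)."""
        return (k_ee * j_ee) / (k_ee_ref * j_ee_ref)

    print(relative_tsca(70, 1.40))   # 30% loss, no homeostasis        -> 0.70
    print(relative_tsca(70, 2.02))   # 30% loss, unlimited homeostasis -> ~1.01
    print(relative_tsca(70, 1.68))   # 30% loss, limited homeostasis   -> 0.84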

In physiologically relevant low-activity regimes, neuronal firing is determined both by the mean and by the fluctuations of the synaptic input. A reduction in the number of synapses followed by an upscaling of synaptic weights may preserve the average population firing rate; it cannot, however, simultaneously preserve the mean and the variance of the synaptic input currents. The neurons’ working point, i.e. the statistics of the synaptic input, will inevitably change. A priori, it is therefore not clear to what extent synapse loss and firing rate homeostasis alter the overall firing statistics in the recurrent network beyond the average firing rate. Here, we address this question by studying the irregularity of spike generation by individual neurons, measured by the coefficient of variation CV of the inter-spike interval distribution, and spike-train synchrony, assessed by the Fano factor FF, the normalized variance of the population spike count in 10ms time intervals (see Sec. Synaptic contact area and characterization of network activity). Without homeostatic compensation, synapse loss generally results in spike patterns that are less irregular (Fig 4A) and less synchronous (Fig 4D). In the presence of firing rate homeostasis, however, both the CV and the FF are largely preserved (Fig 4B and 4E; the light spot in E is due to a significant outlier in one simulation). Only if the level of synapse loss becomes too severe, or if the synaptic-strength limit is reached (limited homeostasis), are the CV and the FF reduced (Fig 4B, 4C, 4E and 4F).
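
Both measures are straightforward to compute from recorded spike trains; the following minimal NumPy version assumes spike times given in seconds and follows the definitions above (CV of the inter-spike intervals per neuron, Fano factor of the population spike count in 10 ms bins).

    import numpy as np

    def cv_isi(spike_times):
        """Coefficient of variation of the inter-spike intervals of a single neuron."""
        isi = np.diff(np.sort(np.asarray(spike_times)))
        return isi.std() / isi.mean()

    def population_fano_factor(all_spike_times, t_start, t_stop, binsize=0.01):
        """Fano factor of the population spike count: variance divided by mean of the
        number of spikes (pooled over all neurons) in bins of `binsize` seconds
        (10 ms as in the main text)."""
        edges = np.arange(t_start, t_stop + binsize, binsize)
        counts, _ = np.histogram(all_spike_times, bins=edges)
        return counts.var() / counts.mean()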

Fig 4. Effect of synapse loss and firing rate homeostasis on spike train statistics.

Dependence of the coefficient of variation CV of inter-spike intervals (A–C) and the Fano factor FF of the population spike count (binsize b = 10ms; D–F) on the synaptic reference weight J and the degree of EE synapse loss in the absence of homeostatic compensation (left column), as well as with unlimited (middle column) and limited firing rate homeostasis (right column). Color-coded data represent mean across 10 random network realizations. Symbols mark parameter configurations shown in Fig 5.

https://doi.org/10.1371/journal.pcbi.1007790.g004

For illustration, Fig 5 depicts the spiking activity for four example parameter settings marked by the symbols in Figs 3 and 4. As Fig 4 already suggests, the overall spiking activity of the homeostatic network (Fig 5C) and the reference network (Fig 5A), e.g. the number and duration of synchronous events and the spiking frequency of single neurons, is very similar. Only the exact timing of the synchronous events and of the single-neuron spiking differs. In the AD network without homeostasis (Fig 5B), the firing rates of both excitatory and inhibitory neurons are decreased. The number of synchronous events, compared with the reference network (Fig 5A), does not seem to be decreased, but their duration is. The network with limited homeostasis (Fig 5D) is more similar to the AD network without homeostasis than to the network with unlimited homeostasis, because the restriction on synaptic growth prevents the rate from being recovered.

Fig 5. Effect of synapse loss and firing rate homeostasis on spiking activity.

Spiking activity (dots mark time and sender of each spike) in an intact reference network (no synapse loss, JEE = 1.4mV; A), as well as in networks where 30% of the EE synapses are removed: B) no homeostasis (JEE = 1.4mV), C) unlimited homeostasis (JEE = 2.02mV), D) limited homeostasis (JEE = 1.68mV). In all panels, the synaptic-weight scale is set to J = 1.4mV. Examples depict parameter configurations marked by corresponding symbols in Figs 3 and 4 (cf. marker in lower right corner of each panel). The values for the parameters explored in these figures (synaptic weight J, EE synaptic weight JEE, total synaptic contact area TSCA, firing rate ν, coefficient of variation CV and Fano factor FF) are listed next to each plot. Regions below and above the gray horizontal line show spiking activity of a subset of 100 excitatory (E) and 25 inhibitory neurons (I), respectively.

https://doi.org/10.1371/journal.pcbi.1007790.g005

Perturbation sensitivity and linear stability

An open question in Alzheimer’s disease research is how cellular damage such as synapse loss affects patients’ cognitive capabilities. A number of theoretical studies have shown that recurrent neuronal networks exhibit optimal computational performance for a variety of task modalities if they operate in a dynamical regime where small perturbations are neither instantly forgotten nor lead to entirely different network states [49–54]. In dynamical systems theory, this regime has been termed the “edge of chaos”, as it represents the transition from a stable state with a low sensitivity to small perturbations to a chaotic state where the sensitivity to small perturbations is high. Here, we investigate the role of synapse loss and firing rate homeostasis in the network’s sensitivity to perturbations as an indicator of its overall computational performance.

To assess the perturbation sensitivity, we simulate a given network twice with identical initial conditions and identical realizations of external inputs. In the second run, we apply a small perturbation by delaying one of the external input spikes to a single neuron by a fraction of a millisecond (Fig 6). In stable regimes, the effect of this perturbation on the spiking response is transient and quickly vanishes (Fig 6A, top). In chaotic regimes, in contrast, the small perturbation leads to diverging spike patterns (Fig 6B, top). We quantify the network’s perturbation sensitivity S = 1 − |R| in terms of the long-term correlation coefficient R between the low-pass filtered spike responses in the two runs (Fig 6, bottom). With this definition, S = 0 and S = 1 correspond to insensitive (stable) and highly sensitive (chaotic) networks, respectively (for details, see Sec. Synaptic contact area and characterization of network activity).
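
A minimal implementation of this measure is sketched below. The exponential filter kernel, its time constant, and the pooling of the filtered responses over neurons are assumptions made for illustration; the precise procedure used in the study is given in Sec. Synaptic contact area and characterization of network activity.

    import numpy as np

    def filtered_response(spike_trains, t_stop, dt=1e-3, tau=0.02):
        """Bin each neuron's spike train at resolution dt (seconds) and low-pass
        filter it with an exponential kernel of time constant tau (kernel choice
        and tau are assumptions).  Returns an array of shape (neurons, bins)."""
        edges = np.arange(0.0, t_stop + dt, dt)
        kernel = np.exp(-np.arange(0.0, 5 * tau, dt) / tau)
        out = []
        for spikes in spike_trains:
            counts, _ = np.histogram(spikes, bins=edges)
            out.append(np.convolve(counts, kernel)[:counts.size])
        return np.array(out)

    def perturbation_sensitivity(trains_unpert, trains_pert, t_perturb, t_stop,
                                 dt=1e-3, tau=0.02):
        """S = 1 - |R|, with R the Pearson correlation between the low-pass filtered
        responses of the unperturbed and perturbed run, restricted to the time after
        the perturbation and pooled over neurons (the pooling is an assumption)."""
        xa = filtered_response(trains_unpert, t_stop, dt, tau)
        xb = filtered_response(trains_pert, t_stop, dt, tau)
        k0 = int(t_perturb / dt)
        r = np.corrcoef(xa[:, k0:].ravel(), xb[:, k0:].ravel())[0, 1]
        return 1.0 - abs(r)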

Fig 6. Perturbation sensitivity.

Top: Example spiking activity (dots mark time and sender of each spike) of two identical networks (identical neuron parameters, connectivity, external input, initial conditions) with (black dots) and without perturbation (purple dots). The perturbation consists in delaying one external input spike at time t* = 400ms by δt* = 0.5ms. The vertical red line marks the time of the perturbation. Spikes of only 10% of all neurons are shown. Neurons below and above the horizontal gray line correspond to excitatory and inhibitory neurons, respectively. Bottom row: Perturbation sensitivity S(t) = 1 − |R(t)| obtained from the correlation coefficient R(t) of the low-pass filtered spike trains generated by the unperturbed and the perturbed network (black and purple dots in top panels; see Sec. Synaptic contact area and characterization of network activity). A) Stable dynamics (J = 0.45mV, KEE = 100). B) Chaotic dynamics (J = 1.75mV, KEE = 100).

https://doi.org/10.1371/journal.pcbi.1007790.g006

For small synaptic weights J, the network dynamics is always stable (S = 0) for our choice of parameters, irrespective of the degree of synapse loss and the absence or presence of homeostatic compensation (Fig 7). In this regime, the perturbation has no long-term effect: after a transient phase, the response spike patterns in the perturbed and the unperturbed simulation are exactly identical (at the temporal resolution Δtf = 1ms of the recorded signals). The intact networks (zero synapse loss) enter a chaotic regime (S > 0) if the synaptic weights J exceed a certain critical value. Removal of EE synapses without homeostatic compensation leads to a shift of this transition towards larger synaptic weights (Fig 7A). Networks in the chaotic regime eventually become insensitive to perturbations with progressing EE synapse loss. In the presence of firing rate homeostasis, in contrast, the perturbation sensitivity is preserved (color gradient in Fig 7B is predominantly left to right, rather than top to bottom). Unless the homeostatic strengthening of EE synapses is limited (limited homeostasis; Fig 7C and S3 Fig in the Supplementary Material), this maintenance of the perturbation sensitivity is observed even if the degree of synapse loss is substantial (> 80%).

Fig 7. Effect of synapse loss and firing rate homeostasis on perturbation sensitivity.

Dependence of perturbation sensitivity S on the synaptic reference weight J and the degree of EE synapse loss in the absence of homeostatic compensation (A), as well as for unlimited (B) and limited firing rate homeostasis (C). Color-coded data represent mean across 10 random network realizations. Superimposed black and gray curves mark regions where the linearized network dynamics is stable (gray dashed; spectral radius ρ = …, 0.6, 0.8), about to become unstable (black; ρ = 1), and unstable (gray solid; ρ = 1.2, 1.4, …). Pink symbols mark parameter configurations shown in Figs 3 and 5.

https://doi.org/10.1371/journal.pcbi.1007790.g007

We conclude that synapse loss, as observed in Alzheimer’s disease, tends to reduce the perturbation sensitivity of the affected networks, and may thereby impair their computational performance for a broad range of task modalities. Homeostatic mechanisms that preserve the average network activity (firing rate) can prevent this reduction in sensitivity and, hence, the decline in computational capability.

So far, the reported results on the perturbation sensitivity were obtained by network simulations for a specific set of parameters. In the following, we employ an analytical approach; firstly, to show that our findings are general and do not depend on the details of the network model, and secondly, to shed light on the mechanisms underlying the reduction in perturbation sensitivity by synapse loss and its maintenance by firing rate homeostasis.

As shown in [79], the dynamics of large random networks of analog nonlinear neurons without (or with constant) external input undergoes a transition from a stable to a chaotic regime at some critical synaptic coupling strength. The study further revealed that this transition coincides with a critical point where the local linearized network dynamics becomes unstable. For more realistic networks of spiking neurons, networks with fluctuating external input or networks with a more realistic connectivity structure, a strict correspondence between the onset of chaotic dynamics and linear instability could not be established [54, 80–84]. Nevertheless, various previous studies suggest that the two transition types are interrelated, in the sense that a change in the linear stability characteristics is accompanied by a change in the network’s sensitivity to small perturbations.

Here, we propose that the linear stability characteristics can serve as an indirect and easily accessible indicator of the network’s sensitivity to small perturbations, and hence its computational capability. As described in Sec. Linearized network dynamics and stability analysis, the linearized network dynamics is determined by the effective connectivity matrix W. Its components wij = Wij (i, j ∈ {1, …, N}) measure the effect of a small fluctuation in the firing rate νj(t) of a presynaptic neuron j on the rate νi(t) of the postsynaptic neuron i at a specific working point determined by the stationary firing rates ν = (ν1, …, νN). The effective connection weights are hence determined not only by the synaptic weights Jij, but also by the excitability of the target cell i, which is in turn determined by the statistics of the synaptic input fluctuations, i.e. the dynamical state of the local network. The linearized dynamics becomes unstable if the spectral radius ρ = Re(λmax), the real part of the maximal eigenvalue λmax of W, exceeds unity.

Loss of EE synapses corresponds to setting a fraction of the excitatory components wij (those connecting excitatory neurons to excitatory neurons) to zero. In the absence of homeostatic compensation, we expect this weakening of positive feedback to have a stabilizing effect. The dependence of the effective weights wij on the working point, however, leads to a non-trivial effect of synapse loss and firing rate homeostasis on the spectral radius ρ. Here, we compute ρ by employing the diffusion approximation of the leaky integrate-and-fire neuron and random-matrix theory (for details, see Sec. Linearized network dynamics and stability analysis).
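
Once the working-point dependent gains are available, the stability measure itself is a standard eigenvalue computation. The sketch below simply diagonalizes a given effective connectivity matrix, whereas the study evaluates ρ analytically via the diffusion approximation and random-matrix theory; the per-neuron gains are taken here as inputs rather than derived.

    import numpy as np

    def spectral_radius(J_mat, gains):
        """Spectral radius rho = Re(lambda_max) of the effective connectivity matrix,
        following the definition in the text.  J_mat[i, j] is the synaptic weight of
        the connection j -> i and gains[i] the linear response gain of neuron i at
        its working point (obtained in the paper from the diffusion approximation of
        the leaky integrate-and-fire neuron; here assumed to be given)."""
        gains = np.asarray(gains, dtype=float)
        W_eff = gains[:, None] * J_mat          # w_ij = gain_i * J_ij
        eigvals = np.linalg.eigvals(W_eff)
        return np.max(eigvals.real)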

As shown in Fig 7 (black and gray curves), the linear stability characteristics (as measured by the spectral radius ρ) bear striking similarities to the sensitivity to perturbations. In the absence of homeostasis, loss of EE synapses leads to a fast decrease in ρ. Linearly unstable networks quickly become stable (Fig 7A). Firing rate homeostasis, in contrast, preserves the spectral radius ρ, even if a substantial fraction of EE synapses is removed. Linearly unstable networks remain unstable (Fig 7B), until the homeostatic resources are exhausted (Fig 7C).

The analytical approach described in Sec. Linearized network dynamics and stability analysis provides us with an intuitive understanding of why and under what conditions firing rate homeostasis preserves the linear stability characteristics in the face of synapse loss. The analysis shows that, in the presence of firing rate homeostasis, the spectral radius ρ is uniquely determined by the stationary average firing rate (red points in Fig 8A and Eq 18). For the parameters chosen in this study, an approximately unique dependence on the firing rate is also observed in the absence of homeostasis and for limited homeostasis (blue and yellow points in Fig 8A). Network simulations reveal similar findings for the perturbation sensitivity S (Fig 8B). For unlimited homeostasis, the firing rate, the perturbation sensitivity and the spectral radius remain (approximately) constant during synapse loss (red points in Fig 8). In the absence of homeostasis or for limited homeostasis, firing rates change; however, the relationship between the firing rate and the spectral radius ρ or the perturbation sensitivity S nevertheless remains the same (red, blue and yellow points in Fig 8 lie on top of one another). The number KEE and the strength JEE of EE synapses therefore play only an indirect role by determining the stationary firing rate ν. Any combination of KEE and JEE that preserves ν will simultaneously preserve ρ (and S).

Fig 8. Firing rate as predictor of linear stability and perturbation sensitivity.

Dependence of the linear stability quantified by the spectral radius ρ (A; theory) and the perturbation sensitivity S (B; simulation results) on the mean stationary firing rate νE of the excitatory neuron population in the absence of homeostasis (blue), as well as for unlimited (red) and limited firing rate homeostasis (yellow). Scatter plots depict data for various reference weights J ∈ {0, …, 3}mV and various degrees of synapse loss from 0% to 50%. Same data as in Figs 7, 12D–12F and 12M–12O.

https://doi.org/10.1371/journal.pcbi.1007790.g008

The unique dependence of the spectral radius ρ on the firing rate ν is a consequence of the working-point dependence of the effective weights wij ∝ η(νi)Jij/σi, where η(νi) is a function of the firing rate νi of the target neuron i (see Eq 16). To maintain the stationary firing rate νE of excitatory neurons, the synaptic weights JEE are increased to compensate for the loss of excitatory synapses, i.e. for the decrease in the number KEE of excitatory inputs. This increase in the synaptic weights Jij (for neurons i, j both in the excitatory population) is accompanied by an increase in the variance of the synaptic input received by the target neuron i. If the response firing rate νi is kept constant (as is the case in the presence of firing rate homeostasis), an increase in σi leads to a decrease in neuron i’s sensitivity to a modulation of the input current caused by a spike of the source neuron j. This interplay between an upscaling of the weights Jij and a downscaling of the neuron’s modulation sensitivity restricts the growth of the effective weight wij and, ultimately, leads to a preservation of the spectral radius ρ. In Sec. Linearized network dynamics and stability analysis, we demonstrate this effect for a homogeneous network of leaky integrate-and-fire neurons. The derivation relies on the assumptions that the synaptic weights are sufficiently small and the rate of synaptic events is high (diffusion approximation), that the stationary firing rates νE and νI of excitatory and inhibitory neurons are identical (homogeneity), and that the input fluctuations caused by external sources are small compared to those generated by the local network.
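
Schematically, and only under the assumptions listed above, the argument can be condensed as follows (a reconstruction for illustration, not a reproduction of Eqs 16–18): with homogeneous rates and negligible external fluctuations, the input variance is itself proportional to the firing rate, so the synaptic parameters cancel from the spectral radius.

    % Sketch of the argument (assumptions: nu_E = nu_I = nu, external input
    % fluctuations negligible, diffusion approximation; K_I denotes the inhibitory
    % in-degree).  Prefactors are omitted.
    \begin{align*}
      w_{ij} &\propto \frac{\eta(\nu_i)\, J_{ij}}{\sigma_i}, &
      \sigma_i^2 &\propto \nu \left( K_{EE} J_{EE}^2 + K_{I}\, g^2 J^2 \right),\\
      \rho^2 &\approx \sum_j \mathrm{Var}(w_{ij})
             \propto \frac{\eta(\nu)^2}{\sigma^2}\left( K_{EE} J_{EE}^2 + K_{I}\, g^2 J^2 \right)
             \propto \frac{\eta(\nu)^2}{\nu}.
    \end{align*}

Any combination of KEE and JEE thus drops out of ρ once the rate ν is fixed, which is the cancellation described in words above.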

The interplay of hypo- and hyperactivity and its effect on E/I balance and perturbation sensitivity

Despite the broad scope of pathological changes observed in AD, we have so far only considered the effect of synapse loss in our AD model. The reason for this simplification is the strikingly high correlation between synapse loss and cognitive decline in AD [12–19], which suggests that it plays a particularly prominent role in AD’s pathophysiology. We have shown that the loss of EE synapses leads to a decreased firing rate (hypoactivity; see Sec. Total synaptic contact area and firing statistics). Our theoretical analysis shows that this effect also generalizes to the unspecific loss of synapses (see S2 Fig in the Supplementary Material). Another, seemingly contradictory, observation is the occurrence of hyperactive episodes (increased network activity, epileptic discharges) that predominantly take place at the initial stages of the disease [85–91]. It has been argued that this hyperactivity is one of the main disease triggers, being responsible for a broad range of subsequent pathologic alterations, such as changes in synaptic receptor expression, synapse loss and neuronal degeneration (for review, see [92]). Accordingly, hyperactivity, which coincides with a shift of the E/I balance towards excitation [93], has been studied intensively and several potential mechanistic causes have been suggested. For example, amyloid beta has been shown to increase glutamate release [94]. Also the loss of particular connections, e.g. from excitatory to inhibitory neurons [95] or from inhibitory to excitatory neurons [90, 96], might contribute to an increased overall network excitability.

The manifold and sometimes contradictory empirical observations make it impossible to create a computational AD model that is consistent with all findings. Therefore, we choose to focus on the most established and uncontroversial phenomena. First, hyperactivity is followed by hypoactivity, with some hyperactive neurons/episodes still emerging at later disease stages [85, 86, 92]. Second, increased synaptic volume [12, 17, 18, 30] correlates with increased postsynaptic potentials and characterizes the initial stages of the disease [30, 92]. Third, synapse loss has been observed in all disease stages and appears to be the best correlate of cognitive decline [12–19, 97].

These well-documented observations could be accounted for by two possible scenarios. In the first scenario, AD-specific changes in the brain primarily lead to synapse loss and hypoactivity, which is compensated for by increased synapse growth and other compensatory mechanisms. These homeostatic mechanisms might be insufficiently regulated, resulting in episodes of hyperactivity. As the disease progresses, the resources that compensate for synapse loss become exhausted and hypoactivity prevails. In the second potential scenario, AD triggers alterations that cause hyperactivity, such as the growth of EE synapses or the weakening of particular connections that promote inhibition (presumably synapses from inhibitory to excitatory neurons). A compensatory reaction of the system then reduces the number of EE synapses in order to increase the impact of inhibition (or decrease the total excitation). This compensation may overshoot, resulting in pathologic hypoactivity in the later stages of the disease.

So far, we have only considered the first scenario (EE synapse loss and EE synapse growth as a compensation mechanism) without modeling an overshooting compensation that leads to hyperactivity. Here, we examine the consequences of such a deregulated homeostasis and an excessive increase of the EE synaptic weights in an AD network that has already lost EE connections (KEE = 70). As described in Sec. Network model, this increase is not, as was previously the case, limited by a reference firing rate. As Fig 9A shows, increasing the EE weights beyond the point at which the network reaches its reference firing rate (black line) easily leads to an explosion of the firing rates. This rise in firing rate goes hand-in-hand with an increased sensitivity (Fig 9C), which likewise exceeds the sensitivity of the reference network at similar points (black line).

As a comparison, we examine the effects of hyperactivity according to the second scenario, i.e. as the primary disease trigger and not as a consequence of a homeostatic overreaction, by increasing the weight of EE synapses in a network with full synaptic density (KEE = 100). Our results show that even a comparatively small increase in the strength of the EE synapses can increase the network’s firing rate drastically, once again causing the network to become more sensitive to small perturbations (Fig 9B and 9D).

Fig 9. Effect of EE synapse growth on firing rate and sensitivity.

Dependence of the firing rate (A, B) and the sensitivity (C, D) on the synaptic reference weight and the degree of EE synapse growth for an EE synapse loss of 30% (KEE = 70) (A, C) and no synapse loss (B, D). An EE synapse growth of zero means that all excitatory synapses have the same weight (JEE = JIE). For values larger than zero, synapses from excitatory neurons onto other excitatory neurons are stronger than those onto inhibitory neurons. Black curves in A and C represent the firing rate and sensitivity of the reference network (no synapse loss, JEE = JIE). Color-coded data represent mean across 10 random network realizations.

https://doi.org/10.1371/journal.pcbi.1007790.g009

A possible homeostatic response to such an increased activity is a reduction in the number of EE synapses. To give numeric examples: in a network with a reference synaptic weight of J = 0.85mV, subject to a synapse loss of 30% which is overcompensated by increasing the EE synapse weight by 50%, the firing rate rises from 1.2 spikes/s to about 2.6 spikes/s. Simultaneously, the sensitivity increases from 0.06 to 0.16 (data extracted from Fig 9A and 9C). A homeostatic reduction of EE synapses to regain the original firing rate of 1.2 spikes/s results in a network with a net loss of 38%, but with the reference sensitivity of 0.06.

If the same reference network (J = 0.85mV) increases its EE weights by 10%, the firing rate rises from 1.2 spikes/s to 7.7 spikes/s, accompanied by a shift in the sensitivity from 0.06 to 0.43 (data extracted from Fig 9B and 9D). If the network compensates for this with an 11% loss of synapses, both the firing rate and the sensitivity of the reference network are recovered. These examples demonstrate that the order of hyper-/hypoactivity, synapse loss and synapse growth is not important for the sensitivity of the network. The only relevant quantity is the firing rate to which the network converges.

In addition to the question of the mechanisms causing hyperactivity in AD, much consideration has been given to its effects. Hyperactivity has been put in the context of a disruption of the network’s E/I balance [93], which is assumed to make a major contribution to cognitive decline [98, 99]. This view implies that an alteration in the E/I balance automatically entails a change in computational performance. To investigate this hypothesis, we examine the relationship between the sensitivity of the network and the E/I balance for the reference and hyperactive networks (estimated as described in Sec. Synaptic contact area and characterization of network activity). Fig 10A shows that networks exhibit a large range of sensitivities for a given E/I balance, in comparison to a rather narrow range for a given firing rate (Fig 10B). We thus conclude that the firing rate, rather than the E/I balance, is the primary indicator of the sensitivity of a network.

Fig 10. Firing rate and not E/I balance as predictor of linear stability and perturbation sensitivity.

Dependence of the perturbation sensitivity S on the E/I balance B (A) and on the mean stationary firing rate (B) for the limited homeostasis network (compare Fig 8, yellow), a network of full synaptic density (KEE = 100) but increased EE synapse growth (JEE ∈ {1 ⋅ JIE, …, 1.1 ⋅ JIE}; black) and for a network in which 30% EE synapse loss (KEE = 70) is compensated by EE synapse growth (JEE ∈ {1 ⋅ JIE, …, 1.5 ⋅ JIE}; red). Color-coded data represent mean across 10 random network realizations.

https://doi.org/10.1371/journal.pcbi.1007790.g010

Effect of local homeostasis on perturbation sensitivity

The homeostatic regulation of the firing rate seems to take place on different spatial scales in the mammalian cortex: within each neuron or even within each dendrite [33, 100], and across a group of neurons in a network [34]. So far, we have only considered a global (network-wide) homeostatic regulation of the network’s firing rate in which the adjustment of a single model parameter (the EE weight) was sufficient to achieve the reference state. For the investigations described above, the necessary alteration to the parameter was determined in an offline fashion according to the following procedure: simulation of the reference network to establish the reference firing rate; deletion of synapses to create an AD network; increase of strength of remaining synapses of the AD network until the reference rate was reached; simulation of the AD network with the new weight configuration. Naturally, this scenario does not reflect what happens in the brain, in which homeostatic regulation occurs continuously.

In this section, we examine to what extent our main result (the sensitivity to perturbations as a unique function of the firing rate) is robust with respect to a local mechanism of firing rate homeostasis combined with a continuous weight update, as proposed by [101, 102]. As described in Sec. Network model, we first delete EE synapses and then simulate the network equipped with the local homeostatic mechanism for an additional 1200 sec. During that time, each excitatory neuron attempts to reach its target firing rate by creating additional incoming excitatory synapses if its rate is below the target rate, and deleting incoming excitatory synapses if its rate is above the target rate. New synapses are created either with the same weight as the existing synapses in the reference network or, assuming a process of synaptic growth, with a very small weight. Shortly before the end of the simulation, we measure the firing rates of the neurons and the sensitivity of the network as described in Sec. Network model.
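
A minimal version of such a rule is sketched below: in each update step, every excitatory neuron whose measured rate is below (above) its target gains (loses) one incoming EE synapse. The tolerance band, the one-synapse-per-step update and the external rate estimate are simplifications for illustration; the actual mechanism follows [101, 102].

    import numpy as np

    def local_homeostasis_step(W, rates, n_exc, target_rate, j_new,
                               tol=0.1, seed=None):
        """One step of a simplified local structural homeostasis rule.  rates[i] is
        the current firing-rate estimate of excitatory neuron i; j_new is the weight
        of newly created EE synapses (e.g. the reference EE weight, or a very small
        value such as 6e-4 mV as in Fig 11).  The tolerance band tol is an assumption."""
        rng = np.random.default_rng(seed)
        for i in range(n_exc):
            row = W[i, :n_exc]                             # incoming EE synapses of neuron i
            if rates[i] < target_rate * (1.0 - tol):
                free = np.flatnonzero(row == 0.0)
                free = free[free != i]                     # no self-connections
                if free.size:
                    row[rng.choice(free)] = j_new          # create one new EE synapse
            elif rates[i] > target_rate * (1.0 + tol):
                existing = np.flatnonzero(row)
                if existing.size:
                    row[rng.choice(existing)] = 0.0        # delete one EE synapse
        return W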

Fig 11A shows that if the local homeostasis model succeeds in regaining the rate of the corresponding reference network, then the sensitivity of the reference network is also regained. Conversely, those networks that do not converge to the desired activity state do not exhibit the sensitivity of the reference network. Unsurprisingly, networks in which new synapses are created with very small weights are less likely to converge to the reference rate in the time allowed. Irrespective of whether the target rate is reached, the sensitivity of the network is determined by its firing rate, as demonstrated in Fig 11B. We thus conclude that our main finding is robust to the assumption of global or local homeostasis mechanisms.

Fig 11. The effect of local homeostasis on sensitivity.

Dependence of the sensitivity to perturbations of networks with local homeostasis on the synaptic reference weight J ∈ {0.5mV, 1.2mV, 2.2mV} (A) and on the network’s population-averaged firing rate ν (B). Red crosses: reference networks without synapse loss (KEE = 100); black dots: networks that underwent synapse loss but that regained the reference firing rate, assuming new synapses were created with a weight of Jlh = JEE; pink dots: as for black dots, but with new synapses created with a weight of Jlh = 6 ⋅ 10−4mV. (B) Relationship between sensitivity and firing rate for all networks, regardless of whether the original firing rate was regained. Dots and circles as in (A); gray and pink triangles: as for black and pink dots, but for networks that could not restore the reference rate. All data points represent the mean across all realizations of a particular network weight J and a particular degree of synapse loss (0, 20, 40, 60, 80%).

https://doi.org/10.1371/journal.pcbi.1007790.g011

Discussion

In this article, we study the effect of Alzheimer’s disease on the dynamics and perturbation sensitivity of recurrent neuronal networks. To this end, we employ a computational model of a generic neuronal network composed of excitatory and inhibitory spiking neurons. Alzheimer’s disease is implemented, to a first approximation, in the form of a loss of excitatory synapses onto excitatory neurons. The resulting decrease in the firing rate is avoided (or delayed) by firing rate homeostasis, which is achieved by increasing the weights of the remaining excitatory-excitatory (EE) synapses. In one scenario, we allow synaptic weights to grow without bounds; in another, to ensure that they stay within the physiological range [18], we limit the maximum synaptic weight during homeostasis to 120% of the reference weight in the intact network (i.e. before synapse loss). We show that, in the absence of homeostatic compensation, a progressive loss of EE synapses not only reduces the average firing rate, but also leads to an increase in spike train regularity and a decrease in the fluctuations of the population activity.

This reduction in firing rate appears to be at odds with empirical observations, which demonstrate that network activity is enhanced in the areas affected earlier (e.g. hippocampus) [87, 103, 104]. For the observations of hyperactive states dominating the initial stages of the disease and of hypoactive brain activity combined with excessive synapse loss in the later stages, two possible scenarios are discussed in the literature [18, 92]. In the first scenario, hyperactivity occurs in the first place and is compensated by activity-reducing strategies, e.g. synapse loss. These compensation mechanisms overshoot, leading to hypoactivity in later disease stages. The physiological observation supporting this hypothesis is that oligomeric amyloid-beta (Aβ) aggregates primarily boost glutamate release and change its uptake [92]. In the long run, glutamate spillover and the potential of Aβ oligomers to enhance the occurrence of phosphorylated tau in spines (for review, see [8]) cause degradation of synaptic connections [105]. This process may explain the decreased activity, as predicted by our model, during later stages of the disease, when the tau pathology becomes more prominent (see, e.g., [86, 91, 106, 107]). In the second scenario, hypoactivity primarily caused by synapse loss is compensated by changes that increase activity. Dysregulated compensatory mechanisms would then lead to hyperactive states. In the course of the disease, compensatory mechanisms are stretched to their limits and cannot compensate for the excessive synapse loss anymore. Both scenarios have in common that: a) if homeostatic mechanisms are optimal, such that the physiological firing rate is maintained, the resulting network configurations feature stronger but fewer EE synapses; b) synapse loss and hypoactivity dominate at later disease stages.

Because of these two commonalities and the possibility of drawing analogies between the two models, we focus our analysis only on the second scenario, in which synapse loss happens first. According to our AD model, the decrease in firing rate in more advanced disease stages can be delayed by homeostatic synaptic scaling. Moreover, our model predicts that, as long as the homeostatic mechanisms are able to restore the network’s firing rate, the CV and the Fano factor are also preserved. Once these mechanisms are exhausted in the later disease stages, our model predicts that the spike train regularity increases and the fluctuations in the population activity decrease. Such phenomena (a weakening of synaptic coupling decreasing the CV) have also been found in other computational studies [80, 108]. However, an experimental investigation of the evolution of activity statistics in the brains of AD animal models is, to our knowledge, yet to be performed.

In addition to the effects on the activity statistics, we demonstrate that the loss of synapses results in a reduction of the network’s sensitivity to small perturbations, which goes hand-in-hand with an increase in linear stability. In the presence of unlimited firing rate homeostasis, the perturbation sensitivity, as well as all other dynamical network characteristics, is preserved, even if the extent of synapse loss is substantial. In addition to the dynamical features, the total synaptic contact area, which is decreased in the AD network due to synapse loss, is largely retained. If the homeostatic synapse growth is limited, the network dynamics as well as the total synaptic area are preserved as long as the firing rate can be maintained. Beyond this point, the network quickly approaches the state of the pathological AD network without homeostasis. The effectiveness of homeostatic compensation investigated in this study provides a possible explanation for why morphological disease-related changes in the brain (e.g. synapse loss) precede any clinically recognizable cognitive deficits by years or even decades [31]. The fact that homeostasis is able to recover all network characteristics is non-trivial, because in the homeostatic network with few but strong EE synapses, the statistics of the synaptic input (mean and variance) are altered with respect to the intact reference network with many weak EE synapses.

In order to investigate this observation further, we analyze the linear stability characteristics of the network and find a unique dependency of the network’s spectral radius on the firing rate under unlimited homeostasis. Previous theoretical studies have shown that simple recurrent neuronal networks exhibit optimal computational performance for a variety of tasks if they operate in a regime where small perturbations are neither amplified nor instantly forgotten, i.e. close to the edge of chaos [49–54]. Here, we regard the network’s sensitivity to a small perturbation as an indicator of its computational performance in a broad sense. Assuming that a healthy network acts close to the edge of chaos, our results suggest that the EE synapse loss observed in AD moves the dynamics of the network away from that point towards a less sensitive regime with stable dynamics.

This key prediction of our study can be tested experimentally in animal models by analyzing time series of recorded neuronal activity. The degree of chaoticity can be revealed by the application of metrics such as the power spectrum, autocorrelation function, fractal dimension, Lyapunov exponents (for review, see [108–110]), and the analysis of neuronal avalanches [108, 110, 111]. Our prediction for such experiments is that the degree of chaoticity depends only on the network’s firing rate, regardless of the exact synapse configuration or EI balance (compare Sec. The interplay of hypo- and hyperactivity and its effect on E/I balance and perturbation sensitivity).

Whereas our analysis accounts for why the sensitivity to perturbation recovers under unlimited homeostasis, it is notable that the coefficient of variation and the Fano factor of the spike trains are also preserved, suggesting a relationship between the transition from the stable to the chaotic regime and these two measures of network activity. It has previously been proposed that the transition in spiking neuronal networks from the homogeneous asynchronous state (small sensitivity to perturbation and small CV) to the heterogeneous asynchronous state (high sensitivity to perturbation and high CV) is equivalent to the point where analogous rate networks become chaotic [80, 112]. Such a relationship would also explain our observation that the maintenance of the stability of the linearized network dynamics coincides with the maintenance of the CV.

Our results raise the question of why a shift towards more stable dynamics would be disadvantageous for the system. From networks with binary or rate dynamics, we know that stable dynamics render the system insensitive to perturbations in the input and prone to fading memory (changes in the external input are quickly forgotten, see, e.g., [113, 114]). For spiking networks, such as the one investigated in this study, it has been shown that chaotic dynamics, by allowing the system to be more flexible in responding to new inputs, are beneficial in periodic pattern generation tasks and liquid state computing [45, 115, 116]. In [115], for example, a recurrent network’s internal weights are fixed and initialized to ensure the network operates in a chaotic regime, while synapses feeding back from an external readout are trained with a supervised learning algorithm (FORCE learning). This study demonstrates the benefits of chaotic dynamics for pattern generation tasks. Additionally, in the absence of output feedback (the classical reservoir computing paradigm), in which the recurrent connections are fixed and only the weights of the connections from the recurrent network to the readout units are trained, it has been shown that stable dynamics impair computational performance in simple classification tasks [45, 117].

On the other hand, insensitivity to small perturbations makes the system less susceptible to disruption by noise and is a prerequisite for the formation of stable attractors, which have been frequently used as a memory storage mechanism in neuronal networks (e.g., [118]). However, more recent recordings in prefrontal and association cortices reveal that single cells exhibit complex and variable dynamics with respect to stimulus representation [119], which neither supports the hypothesis of stable attractors nor points to a network dynamics in the stable regime. Computational studies that have investigated the memory capacity whilst taking heterogeneous neural dynamics into account have found that memory formation succeeds in a chaotic regime [120, 121] or with an embedding of stable subspaces in chaotic dynamics [122]. In addition, the construction of associative memory based on unstable periodic orbits of chaotic attractors has been suggested as a possible way of increasing memory capacity [123]. Thus, stable dynamics appear to be at odds with experiments and might even prove disadvantageous for memory formation. Additionally, concrete links between the stability of network dynamics (particularly the drift towards more sensitive regimes, such as those emerging from gradual synapse loss) and processing capabilities under complex, cognitively-relevant computations are lacking and ought to be established in the future.

On the cognitive level, these results suggest that, as homeostatic compensation mechanisms begin to fail, the shift of dynamics towards the stable regime would cause a decrease in performance within a variety of domains. For example, deficits in memory, known to primarily affect recent experiences of the AD individual, could be accounted for by the hypothesis that chaotic dynamics are needed to form new attractors [121]. In addition, very stable dynamics hinder the transition from one attractor to another, which might explain the difficulties of AD patients with task switching and dual-task processing [124, 125]. Finally, the observation that AD patients often show repetitive speech and actions [126] might be explained by difficulties in moving away from the corresponding attractor state.

So far, only a few other studies on this abstraction level exist that investigate the relationship of the physical symptoms of Alzheimer’s disease to its cognitive deficits. With respect to memory, the effect of synapse loss and compensation through maintaining the TSCA has been investigated in an associative memory model [3941]. In accordance with our results, the impairment of memory retrieval due to (excitatory) synapse loss was shown to be successfully compensated by restoring the TSCA, if the restoration occurs sufficiently quickly. The effect of the restoration on the firing rate was not explicitly shown. Although these studies demonstrated that homeostasis via synaptic up-regulation can retain memory performance, they lack a systematic investigation of different network parameters and do not provide an analytical explanation for the results.

The dynamical and computational consequences of intrinsic and synaptic-scaling based homeostatic processes have been investigated in previous studies [32, 127]. Consistent with our findings on the effect of firing-rate homeostasis in the presence of synapse loss, [127] showed that homeostatic intrinsic plasticity helps to maintain a given (chaotic) working point in the presence of external perturbations (constant external inputs). It prevents recurrent neuronal networks from drifting into a regular (non-chaotic) regime and thereby improves input separability. Similarly, [32] demonstrated that homeostatic synaptic scaling can compensate for a partial deafferentation (loss of external inputs) and maintain the macroscopic network dynamics, provided the degree of deafferentation does not exceed a certain critical level. Above this critical level, their model networks develop into a state dominated by slow oscillations, dense activity and bursting, similar to the effects observed in several CNS disorders. A direct comparison with our study is difficult as the networks in [32] are small (100 neurons in total, each excitatory neuron projecting to 5 excitatory and 2 inhibitory cells), the connectivity is distance-dependent, and neurons are described by two-compartment conductance-based models. Moreover, [32] investigated the effect of changes in the external input, whereas we focus on a loss of recurrent connectivity. In our study, we show that homeostatic synaptic scaling preserves firing rates, linear stability, sensitivity to small perturbations, as well as the degree of firing irregularity (CV) and synchrony (Fano factor). We do not observe any low-frequency oscillatory behaviour. It remains a task for future studies to investigate whether the effects observed in [32] generalize to the type of network studied here.

We complement our numerical results with an analytical approach to gain an intuitive understanding of the mechanisms underlying the recovery of the perturbation sensitivity (and hence, computational performance) by firing rate homeostasis. To study the linear stability characteristics of the network, we apply mean-field theory, similar to the approach used by [80]. Note that we do not claim that a loss of linear stability coincides with the transition from stable to chaotic dynamics [54, 81–84], as observed in large autonomous random networks of analog neurons [79]. Rather, we exploit that the linear stability characteristics follow a similar trend as the perturbation sensitivity. Assessing the linear stability characteristics relies on knowledge of the effective connection strengths, i.e. the number of excess response spikes evoked by an additional input spike in the presence of synaptic background activity. This effective connectivity can be obtained experimentally (see, e.g., [128, 129]), or, for a specific neuron and synapse model, numerically (see, e.g., [130–132]). For simplified models, such as the leaky integrate-and-fire neuron studied here, it can be calculated analytically under simplifying assumptions (diffusion approximation; [133, 134]). However, we note that the preservation of linear stability by firing-rate homeostasis is due to the approximately exponential shape of the gain function. It remains to be investigated whether our results can be generalized to other types of neurons with different gain functions. Our theoretical analysis exposes the working-point dependence of the effective weights as the essential mechanism underlying the recovery of linear stability by firing rate homeostasis: on the one hand, the upscaling of EE synaptic weights required for maintaining the firing rates contributes to a destabilization of the network dynamics. On the other hand, the increase in synaptic weights leads to an increase in the variance of the synaptic-input fluctuations, which, in turn, reduces the neurons’ susceptibility to modulations in the presynaptic input, and therefore stabilizes network dynamics. Note that a similar effect has been described in [135].

Both our theoretical approach and our numerical simulations are predicated on random network connectivity. This randomness, however, neglects basic structures of brain areas that undergo severe pathologic changes in the course of AD progression, e.g. hippocampus, prefrontal cortex and cerebellum. These regions are all organized into layers, and have local connectivity structure comprising special features such as distance dependence [136] or clustered synaptic connectivity [137]. The precise impact of such structural aspects is not yet clear. From the study of [50], we can conclude that, for networks with distance-dependent connectivity, the systematic increase of synaptic weights makes the network more sensitive, as we also observed in this work. However, a more systematic investigation of different network connectivity structures is needed to determine whether our results are robust with respect to different connectivity constraints.

The results reported in this study are based on a model of AD where synapse loss and synaptic scaling are confined to connections between excitatory neurons (EE). The motivation for restricting our investigation to the loss of EE connections is that this appears to be a prominent feature in many cortical areas [7, 71, 105]. Evidence that other types of synapses are also damaged in the course of the disease has been gathered from several mouse models. For example, inhibitory synapses from neurons in the entorhinal cortex to excitatory CA1 hippocampal neurons have been found to be selectively degenerated in AD mice [95]. Our mean-field theoretical results suggest that a global unspecific synapse loss affecting all types of connections (EE, EI, IE, II) leads to noticeable changes in firing rates and linear stability characteristics, but only for higher levels of synapse loss (more than 50%; see S2 Fig in the Supplementary Material). In this scenario, a recovery of firing rates by a synapse-unspecific scaling of synaptic weights largely preserves the linear stability characteristics, similar to our findings obtained for EE synapse loss and EE synapse scaling. This suggests that the commonly reported scaling of EE synapses may well be a mechanism the brain employs to compensate for alterations in dynamical characteristics that are induced by other types of synapse loss.

Although synapse loss correlates best with the cognitive decline observed in AD, by focusing on this aspect, the current study neglects other physical manifestations of AD such as neuron death and alterations of intrinsic neuronal properties [138–142]. These phenomena would affect both inhibition and excitation in the network, so the changes of the resulting firing rate may well be non-monotonic, unlike in our model, having unpredictable effects on the computational properties. Alternatively, they might be entirely unaffected: in a computational study, [143] showed that under some circumstances, a network can compensate for neuron loss without the need for additional homeostasis mechanisms by adjusting neuronal transfer functions. The contribution of intrinsic neuron properties to the claimed hyperexcitability of inhibitory neurons observed in AD has been previously investigated in a computational study by [144]. Whereas the interplay of such properties with synaptic loss and homeostasis is beyond the scope of the current work, our model could be extended to incorporate these aspects. However, there is as yet no consensus on which cell type shows hyperactivity [145] or hypoactivity [146]; this may moreover vary over the course of the disease [91, 147].

Analogously to our focus on synaptic loss in EE connections, we also restricted our investigation of firing rate homeostasis to EE synapse growth. This is motivated by the findings that intense synaptic upscaling is observed in AD and that an increase of excitatory-excitatory connections has been reported as a main compensation mechanism that increases the firing rate in hippocampal and cortical neurons after an artificially induced decrease in activity (e.g. by blocking sodium channels with TTX, or by blocking glutamatergic synapses or AMPARs [148–154]). In Sec. Effect of local homeostasis on perturbation sensitivity we show that, regardless of the type of homeostatic regulation (global or local), sensitivity is a unique function of the firing rate. For local homeostasis, we only focus on postsynaptic regulation and neglect presynaptic adaptation such as changes in release probability and the size of vesicle pools [34, 152, 154–164]. Since our results are robust with respect to the two types of homeostasis investigated here, we expect them to be robust also with respect to the specific model of synaptic plasticity. Apart from synaptic scaling, other mechanisms that increase the network’s firing rate could also be considered, e.g. changes in current flow of ions (e.g., [165, 166]) or moving the spike-initiation zone [167].

In order to understand the complexity of Alzheimer’s disease, it is important to study the effects of the different observed morphological alterations caused by AD, their corresponding homeostatic responses and, crucially, how they interfere with each other.

The findings of our study suggest that homeostatic synaptic scaling might be an attractive target for drug development. However, some caution is required. Firstly, as discussed above, during early AD the neuronal activity seems to be increased, followed by a decrease. Thus, enhancing EE synaptic scaling at the very beginning of AD manifestation could even accelerate the progression of the disease. In the later stages of the disease, supporting synaptic scaling might be beneficial, stabilizing the cognitive performance. Within this context, there are a variety of molecular substrates that regulate synaptic scaling, and which show altered expression patterns in AD, that could be considered as treatment targets, for example MSK1, PSD-95, BDNF, Arc, Calcineurin, CaMK4 and Cdk5 (for reviews see [168]). A major challenge is to determine whether the altered concentrations of these substrates are a consequence of direct AD pathology, or arise as an attempt of the organism to counteract pathology, or even a mixture of both. Thus, in addition to more comprehensive modelling investigations, further research on the exact timeline of morphological changes and their functional implications is needed to identify promising therapeutic targets.

So far, we have related synapse loss and homeostasis solely to observations made in Alzheimer’s disease. In this context, the present study shows that certain cognitive deficits in Alzheimer’s disease may be attributed to changes in the stability characteristics of neuronal network dynamics. Its central aim is to contribute a deeper insight into the relationship between disease-related alterations at the structural, the dynamical and the cognitive levels. The findings of this study are also applicable in an entirely different context: in the face of limited computational resources, neuronal network models are often downscaled by reducing the number of nodes or the number of connections while increasing their strength. This downscaling has limitations if dynamical features such as the temporal structure of correlations in the neuronal activity are to be maintained [169]. The present work demonstrates that certain functional characteristics such as the sensitivity to perturbations or the classification performance can be largely preserved, if the synaptic weights are not limited by biological constraints. This insight may be particularly relevant for cognitive-computing applications based on recurrent neuronal networks implemented in neuromorphic hardware [170]. Here, the realization of natural-density connectivity and communication constitute a major bottleneck, whereas the strength of connections is hardly limited.

Methods

Network model

Network description.

The network consists of N = NE + NI identical leaky integrate-and-fire neurons, subdivided into a population of NE = 1000 excitatory and a population of NI = NE/4 inhibitory neurons. In the intact reference network, each excitatory (inhibitory) neuron receives local excitatory inputs from KEE = ϵNE (KIE = ϵNE) randomly selected excitatory neurons, and inhibitory inputs from KEI = ϵNI (KII = ϵNI) randomly selected inhibitory neurons. In addition, the neurons in the local circuit are driven by external excitatory inputs modeled as an ensemble of p Poissonian spike trains with constant rate νX. Each of these external spike trains is sent to a subset of randomly selected (excitatory and inhibitory) neurons in the network. Synaptic interactions are implemented in the form of stereotyped exponential postsynaptic currents with a time constant τs. The strength Jij of interaction between two neurons j and i, the synaptic weight, is parameterized by the amplitude of the postsynaptic potential of neuron i evoked by an incoming spike from neuron j. In the reference network, all excitatory connections and all inhibitory connections, respectively, have equal synaptic weights, i.e. JEE = JIE = J and JEI = JII = −gJ. The greater number of excitatory inputs is compensated by a larger amplitude of inhibitory synaptic weights (g = 6).

Unless stated otherwise, the network simulations are repeated for M = 10 random realizations of network connectivity, initial conditions and external inputs for each parameter configuration. A detailed description of the network model components, dynamics and parameters is given in the Supplementary Material (S1–S4 Tables). Simulations were performed using NEST (www.nest-simulator.org) version 2.10.0 [171]. All scripts for generating and plotting the data are online at http://doi.org/10.5281/zenodo.3752777.
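To make the construction concrete, the following sketch sets up a network of this type with the NEST 2.x Python interface. It is a minimal illustration, not the published simulation script (available at the Zenodo link above): all parameter values shown here (population sizes, connection probability, weights, delay, external rate) are placeholders rather than the values of S1 and S2 Tables, and synaptic weights are given directly as PSC amplitudes in pA instead of PSP amplitudes in mV.

```python
import nest  # NEST 2.x API, as used for the simulations (v2.10.0)

# Illustrative placeholder parameters (see S1 and S2 Tables for actual values)
NE, NI = 1000, 250                 # excitatory / inhibitory population sizes
eps = 0.1                          # connection probability (assumed)
KEE = KIE = int(eps * NE)          # excitatory in-degrees
KEI = KII = int(eps * NI)          # inhibitory in-degrees
J, g, d = 20.0, 6.0, 1.5           # exc. PSC amplitude (pA), inhibition factor g, delay (ms)

nest.ResetKernel()
E = nest.Create('iaf_psc_exp', NE)   # LIF neurons with exponential PSCs
I = nest.Create('iaf_psc_exp', NI)

# Recurrent connectivity with fixed in-degrees and random presynaptic partners
for targets, K_exc, K_inh in [(E, KEE, KEI), (I, KIE, KII)]:
    nest.Connect(E, targets, {'rule': 'fixed_indegree', 'indegree': K_exc},
                 {'weight': J, 'delay': d})
    nest.Connect(I, targets, {'rule': 'fixed_indegree', 'indegree': K_inh},
                 {'weight': -g * J, 'delay': d})

# External drive: a single Poisson generator per neuron, as a simplification of
# the p independent Poisson spike trains distributed to random subsets of cells
noise = nest.Create('poisson_generator', params={'rate': 8000.0})
nest.Connect(noise, E + I, syn_spec={'weight': J, 'delay': d})

nest.Simulate(1000.0)  # simulate 1 s of activity
```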

AD implementation.

Unless stated otherwise, AD is implemented by a systematic reduction of excitatory synapses onto excitatory neurons (EE synapses), i.e. by reconnecting the same network with a smaller EE in-degree KEE. All other in-degrees (KIE, KEI, KII) are preserved. In Sec. The interplay of hypo- and hyperactivity and its effect on E/I balance and perturbation sensitivity, we moreover study the effects of hyperactivity, which we induce by systematically increasing the weight JEE of EE synapses such that JEE > JIE (both in the presence and absence of synapse loss).

Global homeostasis.

In the presence of global firing-rate homeostasis, the removal of EE connections is compensated by increasing the weights JEE of the remaining EE synapses such that the time and population averaged firing rate ν = (NT)−1 ∑i ∫0T si(t) dt is preserved. Here, si(t) denotes the spike train generated by neuron i (see below), and T = 1s the simulation time. The upscaling of the EE weights JEE is performed through bisection with an initial weight increment ΔJEE = JEE. The algorithm is stopped once the population averaged firing rate ν matches the rate of the corresponding intact reference network up to a precision of 0.5%. In the case of limited homeostasis, JEE is set to 1.2⋅J if the solution of the bisection exceeds 120% of the reference weight J. The weights JIE, JEI and JII of all other connections are not changed by the firing rate homeostasis.
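As an illustration, the following sketch implements a bisection of this kind. The function `simulate_mean_rate` is a hypothetical stand-in for a full network simulation that returns the time- and population-averaged firing rate for a given EE weight; it is not part of the published code.

```python
def homeostatic_weight(nu_target, J_EE_ref, simulate_mean_rate,
                       tol=0.005, max_iter=50):
    """Bisection-style search for the EE weight that restores the target
    firing rate; the initial weight increment equals the reference weight."""
    J_EE, dJ = J_EE_ref, J_EE_ref                     # initial increment ΔJ_EE = J_EE
    for _ in range(max_iter):
        nu = simulate_mean_rate(J_EE)                 # one network simulation
        if abs(nu - nu_target) <= tol * nu_target:    # within 0.5 % of the target
            break
        J_EE += dJ if nu < nu_target else -dJ         # rate too low -> strengthen EE synapses
        dJ *= 0.5                                     # halve the increment
    return J_EE

# Limited homeostasis: cap the solution at 120 % of the reference weight
# J_EE_limited = min(homeostatic_weight(...), 1.2 * J_EE_ref)
```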

Local homeostasis.

The local homeostasis implementation used in Sec. Effect of local homeostasis on perturbation sensitivity employs a form of structural plasticity which aims at maintaining the cell-specific time-averaged firing rates in the presence of synapse loss. The details of the structural plasticity mechanism are described in [101]. Associated parameter values are given in the S4 Table in the Supplementary Material. Briefly: the employed structural plasticity model generates new or removes existing incoming excitatory connections locally, i.e. for each excitatory neuron, based on the intracellular calcium concentration [Ca2+](t). The intracellular calcium concentration [Ca2+](t) is modeled as the low-pass filtered spiking activity of the postsynaptic cell (with time constant τlh and calcium intake per spike βlh). Synapses are generated or removed according to the time-dependent local synapse count z(t). The change in z(t) is governed by a Gaussian growth curve (1) with ξlh = (ηlh + ϵlh)/2. Here, κlh denotes the growth rate, ϵlh the target calcium concentration (proportional to the firing rate of the postsynaptic neuron), and ηlh the minimum calcium concentration needed to create new synapses. The synaptic connectivity is updated according to the described dynamics in discrete time steps of size Δtlh. Newly formed synapses onto excitatory neurons are randomly and independently assigned to presynaptic neurons in the entire excitatory cell population. Multiple connections between two neurons are allowed. Newly established excitatory synapses are assigned a synaptic weight Jlh. In Sec. Effect of local homeostasis on perturbation sensitivity we consider two different cases where new synapses are either weak (Jlh = 0.1mV) or of the same strength as excitatory synapses in the initial intact network (Jlh = J).
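For orientation, the sketch below shows a Gaussian growth curve of the kind described, assuming the form commonly used in calcium-based structural plasticity models; the exact parameterization in [101] may differ, and the width parameter (derived here so that the curve crosses zero at ηlh and ϵlh) is an assumption for illustration only.

```python
import numpy as np

def synapse_growth_rate(ca, kappa_lh, eta_lh, eps_lh):
    """Gaussian growth curve for the local synapse count z(t) (assumed form,
    cf. [101]): dz/dt is positive for calcium concentrations between eta_lh
    and eps_lh (synapse creation) and negative outside (synapse removal)."""
    xi_lh = 0.5 * (eta_lh + eps_lh)                             # centre of the curve
    zeta_lh = (eps_lh - eta_lh) / (2.0 * np.sqrt(np.log(2.0)))  # width, zeros at eta, eps
    return kappa_lh * (2.0 * np.exp(-((ca - xi_lh) / zeta_lh) ** 2) - 1.0)

# Example: Euler update of the synapse count over one homeostatic step dt_lh
# z_new = z_old + dt_lh * synapse_growth_rate(ca, kappa_lh=1e-4, eta_lh=0.1, eps_lh=0.7)
```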

Synaptic contact area and characterization of network activity

Relative total synaptic contact area.

We calculate the total synaptic contact area (TSCA) of the EE synapses as the product JEE KEE of the EE weight JEE and the EE in-degree KEE. The relative TSCA (2) is given by the ratio of the TSCA of the neurodegenerated network (reduced in-degree KEE) and the TSCA of the corresponding intact reference network (full in-degree KEE) with identical weights JIE, JEI and JII.

Spiking activity.

We represent the spike train si(t) = ∑k δ(t − ti,k) of neuron i (i ∈ [1, N]) as the superposition of Dirac delta functions centered about the spike times ti,k (k = 1, 2, …). The spike count ni(t; b) is given by the number of spikes emitted in the time interval [t, t + b]. For subsequent analyses, we further compute the low-pass filtered spiking activity xi(t) = (si * h)(t) of neuron i as the linear convolution of its spike train si(t) with an exponential kernel h(t) = exp(−t/τf)Θ(t) with time constant τf and Heaviside step function Θ(t).
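A direct way to compute this filtered activity on a discrete time grid, assuming spike times and the filter time constant τf are given in the same units, is the following minimal sketch:

```python
import numpy as np

def filtered_activity(spike_times, t_grid, tau_f):
    """Low-pass filtered spiking activity x_i(t): convolution of the spike
    train with the exponential kernel h(t) = exp(-t/tau_f) for t >= 0."""
    x = np.zeros_like(t_grid, dtype=float)
    for t_k in spike_times:
        late = t_grid >= t_k                   # Heaviside step Theta(t - t_k)
        x[late] += np.exp(-(t_grid[late] - t_k) / tau_f)
    return x

# Example with three spikes and a 10 ms filter time constant (illustrative values)
t = np.arange(0.0, 100.0, 0.1)                 # time grid in ms
x = filtered_activity([10.0, 35.0, 36.5], t, tau_f=10.0)
```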

Average firing rate.

The time and population averaged firing rate is given by the total number of spikes emitted in the time interval [0, T], normalized by the network size N and the observation time T = 10s.

Fano factor.

As a global measure of spiking synchrony, we employ the Fano factor (3) of the population spike count for a binsize b = 10ms. 〈n(t; b)〉t and Vart(n(t; b)) denote the mean and the variance of the population spike count n(t; b) across time, respectively. Here, we exploit the fact that the variance of a sum signal n(t) is dominated by pairwise correlations between the individual components ni(t), if the number N of components is large (see, e.g., [172, 173]). Normalization by the mean 〈n(t; b)〉t ensures that FF(b) does not trivially depend on the firing rate or the binsize b. For an ensemble of N independent realizations of a stationary Poisson process, FF(b) = 1, irrespective of b and the firing rate. In this work, an increase in FF indicates an increase in synchrony on a time scale b.
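A minimal computation of FF(b) from the pooled spike times of all neurons, with the binsize b in the same units as the spike times, could look as follows:

```python
import numpy as np

def fano_factor(pooled_spike_times, T, b=10.0):
    """Fano factor of the population spike count n(t; b) for binsize b:
    variance of the count across time bins divided by its mean."""
    edges = np.arange(0.0, T + b, b)
    n, _ = np.histogram(pooled_spike_times, bins=edges)  # population spike count per bin
    return n.var() / n.mean()
```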

Coefficient of variation.

The degree of spiking irregularity of neuron i is quantified by the coefficient of variation CVi = SDk(τi,k)/〈τi,k〉k of the inter-spike intervals τi,k = ti,k − ti,k−1, i.e. the ratio between the standard deviation SDk(τi,k) and the mean 〈τi,k〉k. For a stationary Poisson point process, CVi = 1, irrespective of its firing rate. CVs larger (smaller) than 1 correspond to spike trains that are less (more) regular than a stationary Poisson process. We measure CVi over a time interval T = 10s, and report the population average 〈CVi〉i.
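Correspondingly, the CV of a single spike train and its population average can be computed as in the sketch below; neurons with fewer than two spikes are skipped.

```python
import numpy as np

def cv_isi(spike_times):
    """Coefficient of variation of the inter-spike intervals of one neuron."""
    isi = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
    return isi.std() / isi.mean()

def population_cv(spike_trains):
    """Population-averaged CV over all neurons with at least two spikes."""
    cvs = [cv_isi(st) for st in spike_trains if len(st) > 1]
    return float(np.mean(cvs))
```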

EI-balance.

We define the EI-balance B for each neuron as the ratio of the sums of the incoming currents from excitatory (IE) and inhibitory inputs (II). Thus, the averaged EI-balance for excitatory neurons is BE = IEE/IEI with IEE = JEEKEEνE and IEI = JEIKEIνI and, accordingly, for inhibitory neurons BI = IIE/III with IIE = JIEKIEνE and III = JIIKIIνI. We define the total EI-balance as the average across all neurons B ≔ (NEBE + NIBI)/N.

Sensitivity to perturbation.

We examine the sensitivity of a network to a small perturbation in the input spikes by performing two simulations with identical initial conditions and identical realizations of external inputs. In the second run, we apply a small perturbation by delaying one spike in one external Poisson input at time t* = 400ms by δt* = 0.5ms. As a measure of the network’s perturbation sensitivity, we compute the Pearson correlation coefficient (4) of the low-pass filtered spike responses xi(t) and x̃i(t) in the unperturbed and perturbed simulation, respectively, for each time point t. Here, δxi(t) = xi(t) − 〈xi(t)〉i denotes the deviation of the low-pass filtered spike response xi(t) of neuron i from the population average 〈xi(t)〉i (and analogously for δx̃i(t)); 〈⋅〉i represents the population average. We define the time-dependent and the long-term perturbation sensitivity as S(t) = 1 − |R(t)| (Fig 6, bottom panels) and S = S(tobs = 10s) (Fig 7), respectively. An observation of S(tobs) = 0 indicates that the effect of the small perturbation has vanished, i.e. that the network has stable dynamics and is insensitive to the perturbation. An observation of S(tobs) = 1, in contrast, corresponds to diverging spike patterns in response to the perturbation and thus chaotic dynamics. In dynamical-systems theory and related applications, state differences are typically expressed in terms of the Euclidean distance D between the perturbed and the unperturbed state. Here, we employ the (normalized) correlation coefficient R instead to avoid (trivial) firing rate dependencies. Note that D and R are redundant in the sense that both can be expressed in terms of the moments 〈xi(t)x̃i(t)〉i, 〈xi(t)2〉i, 〈x̃i(t)2〉i, 〈xi(t)〉i and 〈x̃i(t)〉i.
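Given the low-pass filtered activities of the unperturbed and perturbed run as arrays of shape (number of neurons, number of time points), S(t) follows directly from the definition above; the sketch below assumes both arrays share the same time grid.

```python
import numpy as np

def perturbation_sensitivity(X, X_pert):
    """S(t) = 1 - |R(t)|, where R(t) is the Pearson correlation across neurons
    between unperturbed (X) and perturbed (X_pert) filtered activities."""
    dX = X - X.mean(axis=0)              # deviations from the population average
    dXp = X_pert - X_pert.mean(axis=0)
    R = (dX * dXp).mean(axis=0) / np.sqrt((dX**2).mean(axis=0)
                                          * (dXp**2).mean(axis=0))
    return 1.0 - np.abs(R)
```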

Linearized network dynamics and stability analysis

In the following, we describe the analytical approach to investigate the effect of synapse loss and firing rate homeostasis on the network’s linear-stability characteristics. To this end, we employ results obtained from the diffusion approximation of the leaky-integrate-and-fire (LIF) neuron with exponential postsynaptic currents under the assumption that the synaptic time constant τs is small compared to the membrane time constant τm, and that the network activity is sufficiently asynchronous and irregular (mean-field theory; [133, 134, 174]). All parameters that are not explicitly mentioned here can be found in the S1 Table in the Supplementary Material.

Stationary firing rates and fixed points.

For each parameter set (synaptic weight J, extent of synapse loss, different types of firing rate homeostasis), we first identify the self-consistent stationary states by solving (5) for the population averaged firing rates νE and νI of the excitatory and inhibitory subpopulations. Here, (6) represents the stationary firing rate of the LIF neuron in response to a synaptic input current with mean μ and variance σ2 in diffusion approximation, with integration boundaries yθ and yr shifted by a correction term for synaptic filtering involving the Riemann zeta function ζ [133, 134, 174]. For stationary firing rates νE and νI of the local presynaptic neurons, the mean and the variances of the total synaptic input currents to excitatory and inhibitory neurons are given by (7), respectively. The coefficients Kpq (p, q ∈ {E, I}) denote the number of inputs (in-degree) to neurons in population p from population q with the corresponding rescaled PSC amplitudes, KX the number of external inputs for each neuron in the network, and νX the firing rate of the Poissonian external sources. Note that in our network simulations, each external source is connected to a randomly selected subset of neurons. As a result, the number KX of external inputs each neuron in the network receives is a binomially distributed random number. For the analytical treatment, we neglect this variability and replace KX by its average. Eqs 5 and 7 are simultaneously solved numerically using the optimize.root() function (method=‘hybr’) of the scipy package (http://www.scipy.org). To ensure that all solutions are found, the fixed-point search is repeated for 30 pairs of initial rates randomly drawn from a uniform distribution between 0 and 50 spikes/s. If multiple coexisting fixed points are found, the one with the highest firing rates is chosen for the subsequent analysis.
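The sketch below illustrates this fixed-point search for the intact reference network, in which excitatory and inhibitory neurons receive statistically identical input so that a single self-consistent rate suffices. It uses the standard diffusion-approximation rate formula and treats synaptic weights as voltage jumps per spike; the boundary correction for synaptic filtering is omitted, and all parameter values are illustrative placeholders rather than those of S1 and S2 Tables.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import root
from scipy.special import erfcx   # erfcx(-u) = exp(u**2) * (1 + erf(u))

# Illustrative single-neuron parameters (placeholders)
tau_m, tau_ref, V_th, V_r = 20e-3, 2e-3, 20.0, 0.0   # s, s, mV, mV

def lif_rate(mu, sigma):
    """Stationary LIF firing rate in diffusion approximation (Siegert formula);
    the synaptic-filtering correction of the boundaries is omitted here."""
    y_th, y_r = (V_th - mu) / sigma, (V_r - mu) / sigma
    integral, _ = quad(lambda u: erfcx(-u), y_r, y_th)
    return 1.0 / (tau_ref + tau_m * np.sqrt(np.pi) * integral)

def residual(nu, K_E, K_I, J, g, K_X, nu_X):
    """Self-consistency residual nu - G(mu(nu), sigma(nu)); E and I neurons
    receive statistically identical input in the reference network."""
    n = nu[0]
    mu = tau_m * (K_E * J * n - K_I * g * J * n + K_X * J * nu_X)
    var = tau_m * (K_E * J**2 * n + K_I * (g * J)**2 * n + K_X * J**2 * nu_X)
    return [n - lif_rate(mu, np.sqrt(var))]

sol = root(residual, x0=[5.0], args=(100, 25, 0.2, 6.0, 1000, 8.0), method='hybr')
print(sol.x[0], 'spikes/s (self-consistent rate, illustrative parameters)')
```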

Synapse loss and firing rate homeostasis.

In this work, Alzheimer’s disease is modeled by removing a fraction of EE synapses, i.e. by reducing the in-degree KEE. The self-consistent firing rates νE and νI after synapse removal are hence reduced (Fig 12D). In the presence of unlimited firing rate homeostasis, we adjust the weight JEE of the remaining synapses (Fig 12B) until the excitatory self-consistent firing rate of the intact reference network (before synapse removal) is recovered (Fig 12E). To this end, we numerically find the roots of the difference between the excitatory firing rate and its target value by employing again scipy’s optimize.root() function. We repeat the root finding for 30 initial weights randomly drawn from a uniform distribution bracketing the original weight before synapse removal, and keep the solution for which the deviation from the target rate is minimal. For limited homeostasis, the new EE weight is chosen as the minimum of this solution and the maximum permitted weight 1.2⋅J (Fig 12C and 12F).

Fig 12. Mean-field theory.

Dependence of the synaptic weight (A–C), the average firing rate νE of the excitatory population (D–F), the effective weight wEE of EE connections (G–I), the ratio in (16) (J–L), and the spectral radius ρ (M–O) on the synaptic weight J and the degree of synapse loss in the absence of homeostatic compensation (left column), as well as with unlimited (middle column) and limited firing rate homeostasis (right column). Superimposed black curves in (M–O) mark the instability lines ρ = 1. Same parameters as in network simulations (see S1 and S2 Tables in Supplementary Material).

https://doi.org/10.1371/journal.pcbi.1007790.g012

Linearized network dynamics and effective connectivity.

As shown in [173, 174], networks of spiking neurons can be formally linearized about a stationary state (linear-response theory) and thereby be mapped to an N-dimensional system (8) of linear equations describing the dynamics of small firing rate fluctuations around this stationary state. The stationary states are determined as the self-consistent solutions of (9) where ϕ(νin) represents the activation function mapping the vector of stationary input rates νin to the vector of output rates. The coupling kernel hij(t) represents the firing rate impulse response, i.e. the modulation in the output rate νi(t) in response to a delta-shaped fluctuation in the rate νj(t) of presynaptic neuron j.

We refer to the area (10) under the coupling kernel as the effective connection weight. It measures the average number of extra spikes emitted by target neuron i in response to a spike fired by the presynaptic neuron j, in the context of the background activity determined by the stationary state ν*.

Exploiting the fact that the integral of the impulse response of a linear(ized) system is identical to the long-term limit of its step response, the effective weight (11) is given by the derivative of the activation function ϕi of neuron i with respect to the stationary firing rate νj of neuron j, evaluated at the stationary state ν*. With ϕi(ν) = G(μi(ν), σi(ν)) from (6) and the derivatives of μi(ν) and σi2(ν) with respect to νj, we obtain (12) as the effective weight of the LIF neuron in the stationary self-consistent state given by ν* [173, 174]. Note that for the result on the right-hand side of (12), we account only for the derivative of G with respect to the mean input μi (DC susceptibility), but neglect the contribution resulting from a modulation in the input variance σi2. Removal of EE synapses and the resulting decrease in stationary firing rates (Fig 12D) leads to a reduction in the effective weight wEE of EE connections (Fig 12G). In the presence of (unlimited) firing rate homeostasis, upscaling of EE synapses (Fig 12B) and the resulting preservation of firing rates (Fig 12E) results in an increase in wEE (Fig 12H).
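As an alternative to the analytical expression, the effective weight can also be estimated numerically from any implementation of the activation function (for instance, one built from the diffusion-approximation rate formula sketched above) by a symmetric finite difference; phi_i, nu_star and d_nu below are generic placeholders, not quantities from the published code.

```python
import numpy as np

def effective_weight(phi_i, nu_star, j, d_nu=1e-3):
    """Effective weight w_ij = d(phi_i)/d(nu_j) at the stationary state nu_star
    (cf. Eq. 11), estimated by a symmetric finite difference. phi_i maps the
    vector of presynaptic rates to the output rate of neuron i."""
    nu_plus = np.asarray(nu_star, dtype=float).copy()
    nu_minus = nu_plus.copy()
    nu_plus[j] += d_nu
    nu_minus[j] -= d_nu
    return (phi_i(nu_plus) - phi_i(nu_minus)) / (2.0 * d_nu)
```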

Stability analysis.

For the LIF neuron with weak exponential synapses [174] as well as for a variety of other neuron and synapse models [130–132], the effective coupling kernel hij(t) introduced in (8) can be well approximated by an exponential function hij(t) = wij τ−1 exp(−t/τ)Θ(t) with an effective time constant τ and Heaviside function Θ(t). With this approximation, (8) can be written in the form of an N-dimensional system of differential equations (13)

Here, W = {wij} denotes the N × N effective connectivity matrix and δν(t) = (δν1(t), …, δνN(t)) the vector of firing rate fluctuations. The system (13) has bounded solutions only if the real parts of all eigenvalues λk of the effective connectivity matrix W are smaller than unity, i.e. if Re(λk) < 1 (∀k). If ρ = maxk(Re(λk)) > 1, the linearized system is unstable and fluctuations diverge. In the original nonlinear LIF network, an unbounded growth of fluctuations is prevented by the nonlinearities of the single-neuron dynamics. For large random networks where the statistics of the coupling strengths does not depend on the target nodes, the bulk of eigenvalues {λk|k ∈ [1, N]} of W is located in the complex plane within a circle centered at the coordinate origin with a radius ρ which is determined by the variances of the effective connectivity [175]. A single outlier is given by the eigenvalue λk* associated with the eigenvector uk* = (1, 1, …, 1, 1)T; this eigenvalue is determined by the mean effective weight. In inhibition dominated networks, the mean synaptic weight and, hence, λk* are negative. The stability behaviour is therefore solely determined by the spectral radius ρ. For a random network composed of NE excitatory and NI inhibitory neurons with homogeneous in-degrees Kpq (p, q ∈ {E, I}) and weights (14), the squared spectral radius is given by (15)

Here, the variances of the effective connectivity wij are taken across the ensemble of target cells (i ∈ [1, N]), separately for excitatory and inhibitory sources. Without homeostatic compensation, EE synapse loss leads to a stabilization of the linearized network dynamics, i.e. a decrease in ρ (Fig 12M). In the presence of unlimited firing rate homeostasis, the spectral radius ρ is preserved, even if a substantial fraction of EE synapses is removed (Fig 12N). If the homeostatic resources are limited, ρ is maintained until the upscaled synaptic weights reach their maximum value (Fig 12O).
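The stability argument can be cross-checked numerically by constructing an effective connectivity matrix with fixed in-degrees and comparing its eigenvalue spectrum with the radius predicted from the column-wise variances, in the spirit of (15). All numbers below are illustrative assumptions, not the parameters of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and effective weights (excess spikes per input spike)
NE, NI = 1000, 250
KE, KI = 100, 25            # excitatory / inhibitory in-degrees
wE, wI = 0.02, -0.12        # effective weights (assumed values)
N = NE + NI

# Effective connectivity matrix with fixed in-degrees
W = np.zeros((N, N))
for i in range(N):
    W[i, rng.choice(NE, size=KE, replace=False)] = wE
    W[i, NE + rng.choice(NI, size=KI, replace=False)] = wI

# Bulk radius predicted from the column-wise variances of W (cf. Eq. 15)
rho_pred = np.sqrt(W[:, :NE].var(axis=0).sum() + W[:, NE:].var(axis=0).sum())

# Empirical spectrum: the single negative outlier (mean effective weight times
# in-degree) is excluded; the remaining eigenvalues form the bulk
lam = np.linalg.eigvals(W)
rho_emp = np.sort(np.abs(lam))[-2]   # crude bulk-edge estimate
print(rho_pred, rho_emp)             # stable linearized dynamics if rho < 1
```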

Preservation of linear stability by firing rate homeostasis.

At first glance, it is unclear why firing rate homeostasis preserves the linear stability characteristics as measured by the spectral radius ρ. While the stationary firing rates are, by definition, kept constant during synapse loss and homeostasis, the input statistics μE and σE2 (S1D and S1E Fig in Supplementary Material), the integration boundaries yrE and yθE (Fig 13A and 13B), as well as the effective weights wij (Fig 12K) are not. To shed light on the mechanisms leading to the preservation of ρ, we first note that the factor (16) on the right-hand side of (12) is in good approximation uniquely determined by the stationary firing rate (Fig 12J–12L). This can be understood by noting that, according to (6), the firing rates are determined by the integral of f(y) = exp(y2)(1 + erf(y)) between yr and yθ, and that f(y) can be approximated by an exponential function f(y) ≈ A exp(By) for the range of arguments spanned by yrE and yθE (Fig 13). With this approximation, the factor (16) becomes a function of the stationary firing rate alone. For constant firing rate, it is therefore constant, too, and the effective weight is essentially determined by the ratio of the synaptic weight to the amplitude of the input fluctuations, Jij/σi. With wpq ∝ Jpq/σp (p, q ∈ {E, I}), (15) reads (17)

Fig 13. Approximation of f(y) by an exponential function.

A,B) Dependence of yrE (A) and yθE (B) on the synaptic reference weight J and the degree of synapse loss in the presence of unlimited firing rate homeostasis (mean-field theory). C) Graph of f(y) (black) and the exponential function A exp(By) (gray; A = 0.4, B = 2.5) fitted to f(y) in the interval y ∈ [0.5, 1.5]. Same parameters as in network simulations (see S1 and S2 Tables in Supplementary Material).

https://doi.org/10.1371/journal.pcbi.1007790.g013

According to our network simulations as well as the mean-field theory described above, the stationary firing rates of the excitatory and inhibitory subpopulations are identical in the presence of firing rate homeostasis, i.e. νE = νI = ν*. With (7) and assuming that the contribution of the external drive to the total input variances can be neglected (which is the case for the range of parameters considered in this study), we find that the spectral radius (18) is in good approximation uniquely determined by the stationary firing rate ν* (Fig 8A and S1 Fig in Supplementary Material). A constant firing rate (as achieved by firing rate homeostasis) is therefore accompanied by a constant spectral radius.
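The quality of the exponential approximation can be checked directly. The sketch below assumes f(y) = exp(y²)(1 + erf(y)), the standard integrand of the diffusion-approximation rate formula (conveniently available as erfcx(−y)); the fitted constants can be compared with the values quoted in the caption of Fig 13C.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfcx

f = lambda y: erfcx(-y)                 # f(y) = exp(y**2) * (1 + erf(y)), assumed form

y = np.linspace(0.5, 1.5, 200)          # range spanned by y_rE and y_thetaE
popt, _ = curve_fit(lambda y, A, B: A * np.exp(B * y), y, f(y), p0=(0.4, 2.5))
A, B = popt

# Maximum relative error of the exponential approximation on the fit interval
max_rel_err = np.max(np.abs(A * np.exp(B * y) - f(y)) / f(y))
print(A, B, max_rel_err)
```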

Unspecific synapse loss and homeostasis

In this section, we expand our analysis of the linearized network dynamics towards a network in which all types of synapses (EE,EI,IE,II) are removed. Accordingly, the homeostatic upscaling affects all types of synapses (EE, EI, IE, II) such that the target firing rate is reached by applying the same factor c to all synaptic weights and the initial proportion of the different synapse types is kept constant (cJEE = cJIE = cJ and cJEI = cJII = −cgJ with c ≥ 1).

We observe that synapse-unspecific network dilution leads to a drop in firing rate (S2D Fig), but this drop is not as pronounced as if only EE synapses are removed (Fig 12D). For small and moderate degrees of synapse loss, the firing rate changes only slightly. Upscaling J compensates for this and fully restores the firing rates (S2E Fig), even for high levels of synapse loss. In the absence of homeostasis, synapse-unspecific network dilution reduces the spectral radius (S2M Fig), but this effect is weaker than if only EE synapses were removed (Fig 12M). For small and moderate degrees of synapse loss, the spectral radius is hardly affected. Upscaling J fully recovers the spectral radius in the stable regime (ρ < 1). Close to the transition from stable to unstable (ρ = 1, black contour line), recovery of the spectral radius is approximately achieved (S2N Fig).

Different bounds for limited homeostasis

In this section, we investigate the effects of constraining homeostasis to limited degrees of EE synapse growth. If EE synapse growth is limited to only 10%, sensitivity rapidly decreases for synapse losses larger than 10% (S3A Fig). In contrast, if synapses can increase their weights by up to 40%, a shift towards the insensitive regime is only observed if more than 30% of the synapses are removed (S3C Fig). Limiting synapse growth to 20% restores the sensitivity as long as a maximum of 20% of the synapses are removed (S3B Fig).

Supporting information

S1 Table. Description of the network model according to [176].

https://doi.org/10.1371/journal.pcbi.1007790.s001

(PDF)

S2 Table. Network and simulation parameters.

https://doi.org/10.1371/journal.pcbi.1007790.s002

(PDF)

S3 Table. Parameters for evaluation of spike-train statistics and perturbation sensitivity.

https://doi.org/10.1371/journal.pcbi.1007790.s003

(PDF)

S4 Table. Parameters of the local homeostasis implementation and of the according sensitivity experiment.

https://doi.org/10.1371/journal.pcbi.1007790.s004

(PDF)

S1 Fig. Canceling of the synaptic-weight variance by the input variance.

Dependence of (A), (B), (C), input mean μE (D), input variance (E), and the ratio (F) on the synaptic reference weight J and the degree of synapse loss in the presence of unlimited firing rate homeostasis (mean-field theory). Note that (F) is very close to unity in all regions where νE > 0 (cf. Fig 12E). Hence, the ratio between the synaptic-weight variance and the synaptic-input variance is uniquely determined by the firing rate. Same parameters as in network simulations (see, S1 and S2 Tables in Supplementary Material).

https://doi.org/10.1371/journal.pcbi.1007790.s005

(EPS)

S2 Fig. Mean-field theory applied to network with unspecific synapse loss and unspecific synaptic upscaling.

We expand our analysis of the linearized network dynamics towards a network in which all types of synapses (EE, EI, IE, II) are removed. Accordingly, the homeostatic upscaling affects all types of synapses (EE, EI, IE, II) such that the target firing rate is reached by applying the same factor c to all synaptic weights and the initial proportion of the different synapse types is kept constant (cJEE = cJIE = cJ and cJEI = cJII = −cgJ with c ≥ 1). The figure shows the dependence of the synaptic weight (A–C), the average firing rate νE of the excitatory population (D–F), the effective weight wEE of EE connections (G–I), the ratio in (16) (J–L), and the spectral radius ρ (M–O) on the synaptic weight J and the degree of synapse loss in the absence of homeostatic compensation (left column), as well as with unlimited (middle column) and limited firing rate homeostasis (right column). Superimposed black curves in (M–O) mark the instability lines ρ = 1. Same parameters as in network simulations (see S1 and S2 Tables in Supplementary Material). We observe that synapse-unspecific network dilution leads to a drop in firing rate (D), but this drop is not as pronounced as if only EE synapses are removed (Fig 12D). For small and moderate degrees of synapse loss, the firing rate changes only slightly. Upscaling J compensates for this and fully restores the firing rates (E), even for high levels of synapse loss. In the absence of homeostasis, synapse-unspecific network dilution reduces the spectral radius (M), but this effect is weaker than if only EE synapses were removed (Fig 12M). For small and moderate degrees of synapse loss, the spectral radius is hardly affected. Upscaling J fully recovers the spectral radius in the stable regime (ρ < 1). Close to the transition from stable to unstable (ρ = 1, black contour line), recovery of the spectral radius is approximately achieved (N).

https://doi.org/10.1371/journal.pcbi.1007790.s006

(EPS)

S3 Fig. Effect of synapse loss and limited firing rate homeostasis on perturbation sensitivity.

Dependence of perturbation sensitivity S on the synaptic reference weight J and the degree of EE synapse loss for different bounds of limited homeostasis. The different bounds are set such that synaptic weights cannot exceed 110% (A), 120% (B) and 140% (C) of their reference weight. If EE synapse growth is limited to only 10%, sensitivity rapidly decreases for synapse losses larger than 10% (A). In contrast, if synapses can increase their weights up to 40%, a shift towards the insensitive regime is only observed if more than 30% of the synapses are removed (C). Limiting synapse growth to 20% restores the sensitivity as long as a maximum of 20% of the synapses are removed (B).

https://doi.org/10.1371/journal.pcbi.1007790.s007

(EPS)

References

  1. 1. Zhan SS, Beyreuther K, Schmitt HP. Quantitative assessment of the synaptophysin immuno-reactivity of the cortical neuropil in various neurodegenerative disorders with dementia. Dementia. 1993;4(2):66–74. pmid:8358515
  2. 2. Brun A, Liu X, Erikson C. Synapse loss and gliosis in the molecular layer of the cerebral cortex in Alzheimer’s disease and in frontal lobe degeneration. Neurodegeneration. 1995;4(2):171–177. pmid:7583681
  3. 3. Morton AJ, Faull RLM, Edwardson JM. Abnormalities in the synaptic vesicle fusion machinery in Huntington’s disease. Brain Research Bulletin. 2001;56(2):111–117. pmid:11704347
  4. 4. Lin JW, Faber DS. Modulation of synaptic delay during synaptic plasticity. Trends Neurosci. 2002;25(9):449–455. pmid:12183205
  5. 5. Scheff WS, Neltner JH, Nelson PT. Is synaptic loss a unique hallmark of Alzheimer’s disease? Biochem Pharmacol. 2014;88(4):517–528. pmid:24412275
  6. 6. Sheng M, Sabatini BL, Südhof TC. Synapses and Alzheimer’s disease. Cold Spring Harb Perspect Biol. 2012;4(5):a005777. pmid:22491782
  7. 7. Dorostkar MM, Zou C, Blazquez-Llorca L, Herms J. Analyzing dendritic spine pathology in Alzheimer’s disease: problems and opportunities. Acta Neuropathol. 2015;130(1):1–19. pmid:26063233
  8. 8. Tampellini D. Synaptic activity and Alzheimer’s disease: a critical update. Frontiers in Neuroscience. 2015;9:432–439.
  9. 9. Tönnies E, Trushina E. Oxidative Stress, Synaptic Dysfunction, and Alzheimer’s Disease. Journal of Alzheimer’s disease. 2017;57(4):1105–1121. pmid:28059794
  10. 10. Frere S, Slutsky I. Alzheimer’s Disease: From Firing Instability to Homeostasis Network Collapse. Neuron. 2018;97(1):32–58. pmid:29301104
  11. 11. Rajendran L, Paolicelli RC. Microglia–Mediated Synapse Loss in Alzheimer’s Disease. The Journal of Neuroscience. 2018;38(12):2911–2919. pmid:29563239
  12. 12. DeKosky ST, Scheff SW. Synapse loss in frontal cortex biopsies in Alzheimer’s disease: Correlation with cognitive severity. Ann Neurol. 1990;27(5):457–464. pmid:2360787
  13. 13. Scheff WS, DeKosky ST, Price DA. Quantitative assessment of cortical synaptic density in Alzheimer’s disease. Neurobiology of Aging. 1990;11(1):29–37. pmid:2325814
  14. 14. Terry RD, Masliah E, Salmon DP, Butters N, DeTeresa R, Hill R, et al. Physical basis of cognitive alterations in Alzheimer’s disease: Synapse loss is the major correlate of cognitive impairment. Annals of Neurology. 1991;30(4):572–580. pmid:1789684
  15. 15. Scheff WS, Price DA. Synapse loss in the temporal lobe in Alzheimer’s disease. Ann Neurol. 1993;33(2):190–199. pmid:8434881
  16. 16. Masliah E, Mallory M, Hansen L, DeTeresa R, Alford M, Terry R. Synaptic and neuritic alterations during the progression of Alzheimer’s disease. Neuroscience Letters. 1994;174(1):67–72. pmid:7970158
  17. 17. Scheff SW, Price DA. Synaptic pathology in Alzheimer’s disease: a review of ultrastructural studies. Neurobiology of Aging. 2003;24(8):1029–1046. pmid:14643375
  18. 18. Scheff WS, Price DA. Alzheimer’s disease-related alterations in synaptic density: Neocortex and hippocampus. Journal of Alzheimer’s disease: JAD. 2006;9(3 Suppl):101–115. pmid:16914849
  19. 19. Scheff WS, Price DA, Schmitt FA, Scheff MA, Mufson EJ. Synaptic Loss in the Inferior Temporal Gyrus in Mild Cognitive Impairment and Alzheimer Disease. J Alzheimers Dis. 2011;24(3):547–557. pmid:21297265
  20. 20. Bennett DA, Wilson RS, Schneider JA, Evans DA, Beckett LA, Aggarwal NT, et al. Natural history of mild cognitive impairment in older persons. Neurology. 2002;59(2):198–205. pmid:12136057
  21. 21. Weintraub S, Wicklund AH, Salmon DP. The Neuropsychological Profile of Alzheimer Disease. Cold Spring Harb Perspect Med. 2012;2(4):a006171. pmid:22474609
  22. 22. Smith AD. Imaging the progression of Alzheimer pathology through the brain. Proc Natl Acad Sci U S A. 2002;99(7):4135–4137. pmid:11929987
  23. 23. de Toledo-Morrell L, Dickerson B, Sullivan MP, Spanovic C, Wilson R, Bennett DA. Hemispheric differences in hippocampal volume predict verbal and spatial memory performance in patients with Alzheimer’s disease. Hippocampus. 2000;10(2):136–142. pmid:10791835
  24. 24. Thompson PM, Hayashi KM, de Zubicaray G, Janke AL, Rose SE, Semple J, et al. Dynamics of Gray Matter Loss in Alzheimer’s Disease. The Journal of Neuroscience. 2003;23(3):994–1005. pmid:12574429
  25. 25. Chen MK, Mecca AP, Naganawa M, Finnema SJ, Toyonaga T, Lin SF, et al. Assessing synaptic density in Alzheimer disease with synaptic vesicle glycoprotein 2A positron emission tomographic imaging. JAMA Neurology. 2018;75(10):1215–1224. pmid:30014145
  26. 26. Sperling R, Mormino E, Johnson K. The evolution of preclinical Alzheimer’s disease: implications for prevention trials. Neuron. 2014;84(3):608–622. pmid:25442939
  27. 27. Jack CR Jr, Knopman DS, Jagust WJ, Shaw LM, Aisen PS, Weiner MW, et al. Hypothetical model of dynamic biomarkers of the Alzheimer’s pathological cascade. The Lancet Neurology. 2010;9(1):119–128.
  28. 28. Small DH. Mechanisms of Synaptic Homeostasis in Alzheimer’s Disease. Current Alzheimer Research. 2004;1(1):27–32. pmid:15975082
  29. 29. Fernandes D, Carvalho AL. Mechanisms of homeostatic plasticity in the excitatory synapse. J Neurochem. 2016;139(6):973–996. pmid:27241695
  30. 30. Neuman KM, Molina-Campos E, Musial TF, Price AL, Oh KJ, Wolke ML, et al. Evidence for Alzheimer’s disease-linked synapse loss and compensation in mouse and human hippocampal CA1 pyramidal neurons. Brain Struct Funct. 2015;220(6):3143–65. pmid:25031178
  31. 31. Morris JC. Early-stage and preclinical Alzheimer disease. Alzheimer Dis Assoc Disord. 2005;19(3):163–165. pmid:16118535
  32. 32. Fröhlich F, Bazhenov M, Sejnowski TJ. Pathological effect of homeostatic synaptic scaling on network dynamics in diseases of the cortex. J Neurosci. 2008;28(7):1709–1720. pmid:18272691
  33. 33. Lütcke H, Margolis DJ, Helmchen F. Steady or changing? Long-term monitoring of neuronal population activity. Trends in Neurosciences. 2013;36(7):375–384. pmid:23608298
  34. 34. Slomowitz E, Styr B, Vertkin I, Milshtein-Parush H, Nelken I, Slutsky M, et al. Interplay between population firing stability and single neuron dynamics in hippocampal networks. eLife. 2015;4:e04378.
  35. 35. Zhou S, Yu Y. Synaptic E–I Balance Underlies Efficient Neural Coding. Front Neurosci. 2018;12:46. pmid:29456491
  36. 36. Keck T, Keller GB, Jacobsen RI, Eysel UT, Bonhoeffer T, Hübener M. Synaptic Scaling and Homeostatic Plasticity in the Mouse Visual Cortex InVivo. Neuron. 2013;80(2):327–334. pmid:24139037
  37. 37. Vitureira N, Goda Y. The interplay between Hebbian and homeostatic synaptic plasticity. J Cell Biol. 2013;203(2):175–186. pmid:24165934
  38. 38. Styr B, Slutsky I. Imbalance between firing homeostasis and synaptic plasticity drives early-phase Alzheimer’s disease. Nature Neuroscience. 2018;21(4):463–473. pmid:29403035
  39. 39. Horn D, Ruppin E, Usher M, Hermann M. Neural Network Modeling of Memory Deterioration in Alzheimer’s Disease. Neural Computation. 1993;5(5):736–749.
  40. 40. Horn D, Levy N, Ruppin E. Neuronal–Based Synaptic Compensation: A Computational Study in Alzheimer’s Disease. Neural Computation. 1996;8(6):1227–1243. pmid:8768393
  41. 41. Ruppin E, Reggia JA. A Neural Model of Memory Impairment in Diffuse Cerebral Atrophy. The British Journal of Psychiatry. 1995;166(1):19–28. pmid:7894871
  42. 42. Brunel N, Hakim V. Fast Global Oscillations in Networks of Integrate-and-Fire Neurons with Low Firing Rates. Neural Comput. 1999;11(7):1621–1671. pmid:10490941
  43. 43. Jaeger H, Haas H. Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication. Science. 2004;304(5667):78–80. pmid:15064413
  44. 44. Eliasmith C, Anderson CH. Neural engineering: Computation, representation, and dynamics in neurobiological systems. MIT press; 2004.
  45. 45. Maass W, Natschläger T, Markram H. Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput. 2002;14(11):2531–2560. pmid:12433288
  46. 46. Buesing L, Bill J, Nessler B, Maass W. Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons. PLoS Comp Biol. 2011;7(11):e1002211.
  47. 47. Boerlin M, Machens CK, Denève S. Predictive coding of dynamical variables in balanced spiking networks. PLoS Computational Biology. 2013;9(11):e1003258. pmid:24244113
  48. 48. Abbott LF, DePasquale B, Memmesheimer RM. Building functional networks of spiking model neurons. Nat Neurosci. 2016;19(3):350–355. pmid:26906501
  49. 49. Legenstein R, Maass W. Edge of chaos and prediction of computational performance for neural circuit models. Neural Networks. 2007;20(3):323–334. pmid:17517489
  50. 50. Legenstein R, Maass W. What makes a dynamical system computationally powerful. New directions in statistical signal processing: From systems to brain. 2007; p. 127–154.
  51. 51. Langton CG. Computation at the edge of chaos: phase transitions and emergent computation. Physica D: Nonlinear Phenomena. 1990;42(1-3):12–37.
  52. 52. Schrauwen B, Büsing L, Legenstein RA. On computational power and the order-chaos phase transition in reservoir computing. In: Advances in Neural Information Processing Systems. vol. 21; 2009. p. 1425–1432.
  53. 53. Dambre J, Verstraeten D, Schrauwen B, Massar S. Information processing capacity of dynamical systems. Scientific Reports. 2012;2:514. pmid:22816038
  54. 54. Schuecker J, Goedeke S, Helias M. Optimal sequence memory in driven random networks. arXiv. 2017;.
  55. 55. Scholl DA. THE ORGANIZATION OF THE CEREBRAL CORTEX. New York: Springer-Verlag; 1956.
  56. 56. Abeles M. Local Cortical Circuits: An Electrophysiological Study. Studies of Brain Function. Berlin, Heidelberg, New York: Springer-Verlag; 1982.
  57. 57. DeFelipe J, Fariñas I. The pyramidal neuron of the cerebral cortex: Morphological and chemical characteristics of the synaptic inputs. Progress in Neurobiology. 1992;39(6):563–607. pmid:1410442
  58. 58. Gulyás AI, Megías M, Emri Z, Freund TF. Total Number and Ratio of Excitatory and Inhibitory Synapses Converging onto Single Interneurons of Different Types in the CA1 Area of the Rat Hippocampus. J Neurosci. 1999;19(22):10082–10097. pmid:10559416
  59. 59. Binzegger T, Douglas RJ, Martin KAC. A Quantitative Map of the Circuit of Cat Primary Visual Cortex. J Neurosci. 2004;24(39):8441–8453. pmid:15456817
  60. 60. Lefort S, Tomm C, Floyd Sarria JC, Petersen CCH. The Excitatory Neuronal Network of the C2 Barrel Column in Mouse Primary Somatosensory Cortex. Neuron. 2009;61(2):301–316. pmid:19186171
  61. 61. Tomko GJ, Crapper DR. Neuronal variability: non-stationary responses to identical visual stimuli. Brain Research. 1974;79(3):405–418. pmid:4422918
  62. 62. Softky WR, Koch C. The Highly Irregular Firing of Cortical Cells Is Inconsistent with Temporal Integration of Random EPSPs. J Neurosci. 1993;13(1):334–350. pmid:8423479
  63. 63. Shadlen MN, Newsome WT. The Variable Discharge of Cortical Neurons: Implications for Connectivity, Computation, and Information Coding. J Neurosci. 1998;18(10):3870–3896. pmid:9570816
  64. 64. Ecker AS, Berens P, Keliris GA, Bethge M, Logothetis NK, Tolias AS. Decorrelated Neuronal Firing in Cortical Microcircuits. Science. 2010;327(5965):584–587. pmid:20110506
  65. 65. Petersen CC, Crochet S. Synaptic computation and sensory processing in neocortical layer 2/3. Neuron. 2013;78(1):28–48. pmid:23583106
  66. 66. Cowan RL, Wilson CJ. Spontaneous firing patterns and axonal projections of single corticostriatal neurons in the rat medial agranular cortex. J Neurophysiol. 1994;71:17–32. pmid:8158226
  67. 67. Timofeev I, Grenier F, Steriade M. Disfacilitation and active inhibition in the neocortex during the natural sleep-wake cycle: An intracellular study. Proc Natl Acad Sci USA. 2001;98(4):1924–1929. pmid:11172052
  68. 68. Steriade M, Nuñez A, Amzica F. A novel slow (< 1 Hz) oscillation of neocortical neurons in vivo: depolarizing and hyperpolarizing components. J Neurosci. 1993;13(8):3252–3265. pmid:8340806
  69. 69. Okun M, Lampl I. Instantaneous correlation of excitation and inhibition during ongoing and sensory-evoked activities. Nat Neurosci. 2008;11(5):535–537. pmid:18376400
  70. 70. Brunel N. Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J Comput Neurosci. 2000;8(3):183–208. pmid:10809012
  71. 71. Lacor PN, Buniel MC, Furlow PW, Clemente AS, Velasco PT, Wood M, et al. Abeta oligomer-induced aberrations in synapse composition, shape, and density provide a molecular basis for loss of connectivity in Alzheimer’s disease. J Neurosci. 2007;27(4):796–807. pmid:17251419
  72. 72. Marder E, Goaillard JM. Variability, compensation and homeostasis in neuron and network function. Nat Rev Neurosci. 2006;7(7):563–574. pmid:16791145
  73. 73. Turrigiano GG. The Self-Tuning Neuron: Synaptic Scaling of Excitatory Synapses. Cell. 2008;135(3):422–435. pmid:18984155
  74. 74. Vitureira N, Letellier M, Goda Y. Homeostatic synaptic plasticity: from single synapses to neural circuits. Current Opinion in Neurobiology. 2012;22(3):516–521. pmid:21983330
  75. 75. Turrigiano GG. Homeostatic Synaptic Plasticity: Local and Global Mechanisms for Stabilizing Neuronal Function. Cold Spring Harb Perspect Biol. 2012;4(1):a005736. pmid:22086977
  76. 76. Song S, Sjöström PJ, Reigl M, Nelson S, Chklovskii D. Highly nonrandom features of synaptic connectivity in local cortical circuits. PLOS Biol. 2005;3(3):e68. pmid:15737062
  77. 77. Ikegaya Y, Sasaki T, Ishikawa D, Honma N, Tao K, Takahashi N, et al. Interpyramid Spike Transmission Stabilizes the Sparseness of Recurrent Network Activity. Cereb Cortex. 2013;23(2):293–304. pmid:22314044
  78. 78. Major G, Larkum ME, Schiller J. Active Properties of Neocortical Pyramidal Neuron Dendrites. Annu Rev Neurosci. 2013;36:1–24. pmid:23841837
  79. 79. Sompolinsky H, Crisanti A, Sommers HJ. Chaos in Random Neural Networks. Phys Rev Lett. 1988;61:259–262. pmid:10039285
  80. 80. Ostojic S. Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons. Nat Neurosci. 2014;17:594–600. pmid:24561997
  81. 81. Engelken R, Farkhooi F, Hansel D, van Vreeswijk C, Wolf FR. Comment on “Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons”. bioRxiv. 2015; p. 017798.
  82. 82. Ostojic S. Response to Comment on “Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons”. bioRxiv. 2015; p. 020354.
  83. 83. Kadmon J, Sompolinsky H. Transition to Chaos in Random Neuronal Networks. Phys Rev X. 2015;5:041030.
  84. 84. Harish O, Hansel D. Asynchronous Rate Chaos in Spiking Neuronal Circuits. PLoS Comput Biol. 2015;11(7):e1004266. pmid:26230679
  85. 85. Poisnel G, Hérard AS, El Tannir El Tayara N, Bourrin E, Volk A, Kober F, et al. Increased regional cerebral glucose uptake in an APP/PS1 model of Alzheimer’s disease. Neurobiology of Aging. 2012;33(9):1995–2005. pmid:22079157
  86. 86. Herholz K. Cerebral glucose metabolism in preclinical and prodromal Alzheimer’s disease. Expert Review of Neurotherapeutics. 2010;10(11):1667–1673. pmid:20977325
  87. 87. Amatniek JC, Hauser WA, DelCastillo Castaneda C, Jacobs DM, Marder K, Bell K, et al. Incidence and Predictors of Seizures in Patients with Alzheimer’s Disease. Epilepsia. 2006;47(5):867–872. pmid:16686651
  88. 88. Noebels J. A perfect storm: Converging paths of epilepsy and Alzheimer’s dementia intersect in the hippocampal formation. Epilepsia. 2011;52 Suppl 1(Suppl 1):39–46. pmid:21214538
  89. 89. Vossel KA, Beagle AJ, Rabinovici GD, Shu H, Lee SE, Naasan G, et al. Seizures and epileptiform activity in the early stages of Alzheimer disease. JAMA Neurology. 2013;70(9):1158–1166. pmid:23835471
  90. 90. Palop JJ, Chin J, Roberson ED, Wang J, Thwin MT, Bien-Ly N, et al. Aberrant Excitatory Neuronal Activity and Compensatory Remodeling of Inhibitory Hippocampal Circuits in Mouse Models of Alzheimer’s Disease. Neuron. 2007;55(5):697–711. pmid:17785178
  91. 91. Busche MA, Chen X, Henning HA, Reichwald J, Staufenbiel M, Sakmann B, et al. Critical role of soluble amyloid-β for early hippocampal hyperactivity in a mouse model of Alzheimer’s disease. Proc Natl Acad Sci U S A. 2012;109(22):8740–8745.
  92. 92. Findley CA, Bartke A, Hascup KN, Hascup ER. Amyloid Beta–Related Alterations to Glutamate Signaling Dynamics During Alzheimer’s Disease Progression. ASN Neuro. 2019;11:1759091419855541. pmid:31213067
  93. 93. Vico Varela E, Etter G, Williams S. Excitatory–inhibitory imbalance in Alzheimer’s disease and therapeutic significance. Neurobiology of Disease. 2019;127:605–615. pmid:30999010
  94. 94. Talantova M, Sanz-Blasco S, Zhang X, Xia P, Akhtar MW, Okamoto Si, et al. Aβ induces astrocytic glutamate release, extrasynaptic NMDA receptor activation, and synaptic loss. Proc Natl Acad Sci USA. 2013;110(27):E2518–E2527. pmid:23776240
  95. 95. Yang X, Yao C, Tian T, Li X, Yan H, Wu J, et al. A novel mechanism of memory loss in Alzheimer’s disease mice via the degeneration of entorhinal–CA1 synapses. Molecular Psychiatry. 2016;23:199–210. pmid:27671476
  96. 96. Garcia-Marin V, Blazquez-Llorca L, Rodriguez JR, Boluda S, Muntane G, Ferrer I, et al. Diminished perisomatic GABAergic terminals on cortical neurons adjacent to amyloid plaques. Front Neuroanat. 2009;3:28. pmid:19949482
  97. 97. Grutzendler J, Helmin K, Tsai J, Gan W. Various Dendritic Abnormalities Are Associated with Fibrillar Amyloid Deposits in Alzheimer’s Disease. Annals of the New York Academy of Sciences. 2007;1097:30–39. pmid:17413007
  98. 98. Busche MA, Eichhoff G, Adelsberger H, Abramowski D, Wiederhold KH, Haass C, et al. Clusters of Hyperactive Neurons Near Amyloid Plaques in a Mouse Model of Alzheimer’s Disease. Science. 2008;321(5896):1686–1689. pmid:18802001
  99. 99. Sanchez PE, Zhu L, Verret L, Vossel KA, Orr AG, Cirrito JR, et al. Levetiracetam suppresses neuronal network dysfunction and reverses synaptic and cognitive deficits in an Alzheimer’s disease model. Proc Natl Acad Sci USA. 2012;109(42):E2895–E2903. pmid:22869752
  100. 100. Hengen KB, Torrado Pacheco A, McGregor JN, Van Hooser SD, Turrigiano GG. Neuronal Firing Rate Homeostasis Is Inhibited by Sleep and Promoted by Wake. Cell. 2016;165(1):180–191. pmid:26997481
  101. 101. Diaz-Pier S, Naveau M, Butz-Ostendorf M, Morrison A. Automatic Generation of Connectivity for Large-Scale Neuronal Network Models through Structural Plasticity. Front Neuroanat. 2016;10:57.
  102. 102. Nowke C, Diaz-Pier S, Weyers B, Hentschel B, Morrison A, Kuhlen TW, et al. Toward Rigorous Parameterization of Underconstrained Neural Network Models Through Interactive Visualization and Steering of Connectivity Generation. Front Neuroinform. 2018;12:32.
  103. 103. Mendez MF, Catanzaro P, Doss RC, Arguello R, Frey WH. Seizures in Alzheimer’s Disease: Clinicopathologic Study. Journal of Geriatric Psychiatry and Neurology. 1994;7(4):230–233. pmid:7826492
  104. 104. Lam AD, Deck G, Goldman A, Eskandar EN, Noebels J, Cole AJ. Silent Hippocampal Seizures and Spikes Identified by Foramen Ovale Electrodes in Alzheimer’s Disease. Nature Medicine. 2017;23(6):678–680. pmid:28459436
  105. 105. Merino-Serrais P, Benavides-Piccione R, Blazquez-Llorca L, Kastanauskaite A, Rábano A, Avila J, et al. The influence of phospho-tau on dendritic spines of cortical pyramidal neurons in patients with Alzheimer’s disease. Brain. 2013;136(Pt 6):1913–1928. pmid:23715095
  106. 106. Dickerson BC, Salat DH, Greve DN, Chua EF, Rand-Giovannetti E, Rentz DM, et al. Increased hippocampal activation in mild cognitive impairment compared to normal aging and AD. Neurology. 2005;65(3):404–411.
  107. 107. O’Brien JL, O’Keefe KM, LaViolette PS, DeLuca AN, Blacker D, Dickerson BC, et al. Longitudinal fMRI in elderly reveals loss of hippocampal activation with clinical decline. Neurology. 2010;75(24):1969–1976.
  108. 108. Kriener B, Enger H, Tetzlaff T, Plesser HE, Gewaltig MO, Einevoll GT. Dynamics of self-sustained asynchronous-irregular activity in random networks of spiking neurons with strong synapses. Front Comput Neurosci. 2014;8:136. pmid:25400575
  109. 109. Golovko V, Savitsky Y, Maniakov N. Neural Networks for Signal Processing in Measurement Analysis and Industrial Applications: the Case of Chaotic Signal Processing. NATO Science Series Sub Series III Computer and Systems Sciences. 2003;185:119–144.
  110. 110. Beggs JM, Plenz D. Neuronal avalanches in neocortical circuits. J Neurosci. 2003;23(35):11167–11177. pmid:14657176
  111. 111. Friedman N, Ito S, Brinkman BAW, Shimono M, DeVille REL, Dahmen KA, et al. Universal Critical Dynamics in High Resolution Neuronal Avalanche Data. Phys Rev Lett. 2012;108(20):208102. pmid:23003192
  112. 112. Wieland S, Bernardi D, Schwalger T, Lindner B. Slow fluctuations in recurrent networks of spiking neurons. Physical Review E. 2015;92(4):040901.
  113. 113. Boyd S, Chua L. Fading memory and the problem of approximating nonlinear operators with Volterra series. IEEE Transactions on Circuits and Systems. 1985;32:1150–1165.
  114. 114. Bertschinger N, Natschläger T. Real-time computation at the edge of chaos in recurrent neural networks. Neural Comput. 2004;16(7):1413–1436. pmid:15165396
  115. 115. Sussillo D, Abbott LF. Generating coherent patterns of activity from chaotic neural networks. Neuron. 2009;63(4):544–557. pmid:19709635
  116. 116. Nicola W, Clopath C. Supervised learning in spiking neural networks with FORCE training. Nature Communications. 2017;8(1):2208. pmid:29263361
  117. 117. Maass W, Joshi P, Sontag ED. Computational aspects of feedback in neural circuits. PLOS Comput Biol. 2007;3(1):1–20.
  118. 118. Li G, Ramanathan K, Ning N, Shi L, Wen C. Memory dynamics in attractor networks. Comput Intell Neurosci. 2015;2015:191745.
  119. 119. Jun JK, Miller P, Hernández A, Zainos A, Lemus L, Brody CD, et al. Heterogenous Population Coding of a Short-Term Memory and Decision Task. The Journal of Neuroscience. 2010;30(3):916–929. pmid:20089900
  120. 120. Pereira U, Brunel N. Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data. Neuron. 2018;99(1):227–238.e4.
  121. 121. Barak O, Sussillo D, Romo R, Tsodyks M, Abbott LF. From fixed points to chaos: three models of delayed discrimination. Prog Neurobiol. 2013;103:214–222. pmid:23438479
  122. 122. Murray JD, Bernacchia A, Roy NA, Constantinidis C, Romo R, Wang XJ. Stable population coding for working memory coexists with heterogeneous neural dynamics in prefrontal cortex. Proc Natl Acad Sci USA. 2017;114(2):394–399. pmid:28028221
  123. 123. Wagner C, Stucki JW. Construction of an Associative Memory using Unstable Periodic Orbits of a Chaotic Attractor. Journal of Theoretical Biology. 2002;215(03):375–384. pmid:12054844
  124. 124. Belleville S, Bherer L, Lepage E, Chertkow H, Gauthier S. Task switching capacities in persons with Alzheimer’s disease and mild cognitive impairment. Neuropsychologia. 2008;46(8):2225–2233. pmid:18374374
  125. 125. Baddeley AD, Baddeley HA, Bucks RS, Wilcock GK. Attentional control in Alzheimer’s disease. Brain. 2001;124(Pt 8):1492–1508. pmid:11459742
  126. 126. Cullen B, Coen RF, Lynch CA, Cunningham CJ, Coakley D, Robertson IH, et al. Repetitive behaviour in Alzheimer’s disease: description, correlates and functions. International Journal of Geriatric Psychiatry. 2005;20(7):686–693. pmid:16021661
  127. 127. Naudé J, Cessac B, Berry H, Delord B. Effects of Cellular Homeostatic Intrinsic Plasticity on Dynamical and Computational Properties of Biological Recurrent Neural Networks. J Neurosci. 2013;33(38):15032–15043. pmid:24048833
  128. 128. Boucsein C, Tetzlaff T, Meier R, Aertsen A, Naundorf B. Dynamical response properties of neocortical neuron ensembles: multiplicative versus additive noise. J Neurosci. 2009;29(4):1006–1010. pmid:19176809
  129. 129. London M, Roth A, Beeren L, Häusser M, Latham PE. Sensitivity to perturbations in vivo implies high noise and suggests rate coding in cortex. Nature. 2010;466:123–128. pmid:20596024
  130. 130. Nordlie E, Tetzlaff T, Einevoll GT. Rate dynamics of leaky integrate-and-fire neurons with strong synapses. Front Comput Neurosci. 2010;4:149. pmid:21212832
  131. 131. Heiberg T, Kriener B, Tetzlaff T, Casti A, Einevoll GT, Plesser HE. Firing-rate models capture essential response dynamics of LGN relay cells. J Comput Neurosci. 2013;35(3):359–375. pmid:23783890
  132. 132. Heiberg T, Kriener B, Tetzlaff T, Einevoll GT, Plesser HE. Firing-rate models for neurons with a broad repertoire of spiking behaviors. J Comput Neurosci. 2018;45(2):103–132. pmid:30146661
  133. 133. Fourcaud N, Brunel N. Dynamics of the firing probability of noisy integrate-and-fire neurons. Neural Comput. 2002;14(9):2057–2110. pmid:12184844
  134. 134. Schuecker J, Diesmann M, Helias M. Modulated escape from a metastable state driven by colored noise. Phys Rev E. 2015;92:052119.
  135. 135. Grytskyy D, Tetzlaff T, Diesmann M, Helias M. Invariance of covariances arises out of noise. AIP Conf Proc. 2013;1510:258–262.
  136. 136. Vegué M, Perin R, Roxin A. On the Structure of Cortical Microcircuits Inferred from Small Sample Sizes. J Neurosci. 2017;37(35):8498–8510. pmid:28760860
  137. 137. Druckmann S, Feng L, Lee B, Yook C, Zhao T, Magee JC, et al. Structured Synaptic Connectivity between Hippocampal Regions. Neuron. 2014;81(3):629–640. pmid:24412418
  138. 138. Hoxha E, Boda E, Montarolo F, Parolisi R, Tempia F. Excitability and Synaptic Alterations in the Cerebellum of APP/PS1 Mice. PLoS ONE. 2012;7(4):e34726.
  139. 139. Haghani M, Janahmadi M, Shabani M. Protective effect of cannabinoid CB1 receptor activation against altered intrinsic repetitive firing properties induced by Aβ neurotoxicity. Neuroscience Letters. 2012;507(1):33–37. pmid:22172925
  140. 140. Liu Q, Xie X, Lukas RJ, St John PA, Wu J. A Novel Nicotinic Mechanism Underlies β-Amyloid-Induced Neuronal Hyperexcitation. J Neurosci. 2013;33(17):7253–7263. pmid:23616534
  141. 141. Corbett BF, Leiser SC, Ling HP, Nagy R, Breysse N, Zhang X, et al. Sodium Channel Cleavage Is Associated with Aberrant Neuronal Activity and Cognitive Deficits in a Mouse Model of Alzheimer’s Disease. Journal of Neuroscience. 2013;33(16):7020–7026. pmid:23595759
  142. 142. Eslamizade MJ, Saffarzadeh F, Mousavi SMM, Meftahi GH, Hosseinmardi N, Mehdizadeh M, et al. Alterations in CA1 pyramidal neuronal intrinsic excitability mediated by Ih channel currents in a rat model of amyloid beta pathology. Neuroscience. 2015;305:279–292. pmid:26254243
  143. 143. Barrett DG, Denève S, Machens CK. Optimal compensation for neuron loss. eLife. 2016;5:e12454. pmid:27935480
  144. 144. Perez C, Ziburkus J, Ullah G. Analyzing and Modeling the Dysfunction of Inhibitory Neurons in Alzheimer’s Disease. PLoS One. 2016;11(12):e0168800. pmid:28036398
  145. 145. Zilberter M, Ivanov A, Ziyatdinova S, Mukhtarov M, Malkov A, Alpár A, et al. Dietary energy substrates reverse early neuronal hyperactivity in a mouse model of Alzheimer’s disease. J Neurochem. 2013;125(1):157–171. pmid:23241062
  146. 146. Yun SH, Gamkrelidze G, Stine WB, Sullivan PM, Pasternak JF, Ladu MJ, et al. Amyloid-beta(1–42) reduces neuronal excitability in mouse dentate gyrus. Neurosci Lett. 2006;403(1–2):162–165. pmid:16765515
  147. 147. Orbán G, Völgyi K, Juhász G, Penke B, Kékesi KA, Kardos J, et al. Different electrophysiological actions of 24- and 72-hour aggregated amyloid-beta oligomers on hippocampal field population spike in both anesthetized and awake rats. Brain Research. 2010;1354:227–235. pmid:20659435
  148. 148. Lissin DV, Gomperts SN, Carroll RC, Christine CW, Kalman D, Kitamura M, et al. Activity differentially regulates the surface expression of synaptic AMPA and NMDA glutamate receptors. Proc Natl Acad Sci U S A. 1998;95(12):7097–7102. pmid:9618545
  149. 149. O’Brien RJ, Kamboj S, Ehlers MD, Rosen KR, Fischbach GD, Huganir RL. Activity-Dependent Modulation of Synaptic AMPA Receptor Accumulation. Neuron. 1998;21(5):1067–1078. pmid:9856462
  150. 150. Turrigiano GG, Leslie KR, Desai NS, Rutherford LC, Nelson SB. Activity-dependent scaling of quantal amplitude in neocortical neurons. Nature. 1998;391(6670):892–896. pmid:9495341
  151. 151. Watt AJ, van Rossum MC, MacLeod KM, Nelson SB, Turrigiano GG. Activity Coregulates Quantal AMPA and NMDA Currents at Neocortical Synapses. Neuron. 2000;26(3):659–670. pmid:10896161
  152. 152. Thiagarajan TC, Lindskog M, Tsien RW. Adaptation to Synaptic Inactivity in Hippocampal Neurons. Neuron. 2005;47(5):725–737. pmid:16129401
  153. 153. Ibata K, Sun Q, Turrigiano GG. Rapid Synaptic Scaling Induced by Changes in Postsynaptic Firing. Neuron. 2008;57(6):819–826. pmid:18367083
  154. 154. Kim J, Tsien RW. Synapse-specific adaptations to inactivity in hippocampal circuits achieve homeostatic gain control while dampening network reverberation. Neuron. 2008;58(6):925–937. pmid:18579082
  155. 155. Bacci A, Coco S, Pravettoni E, Schenk U, Armano S, Frassoni C, et al. Chronic Blockade of Glutamate Receptors Enhances Presynaptic Release and Downregulates the Interaction between Synaptophysin-Synaptobrevin-Vesicle-Associated Membrane Protein 2. J Neurosci. 2001;21(17):6588–6596. pmid:11517248
  156. 156. Branco T, Staras K, Darcy K, Goda Y. Local dendritic activity sets release probability at hippocampal synapses. Neuron. 2008;59(3):475–485. pmid:18701072
  157. 157. Burrone J, O’Byrne M, Murthy VN. Multiple forms of synaptic plasticity triggered by selective suppression of activity in individual neurons. Nature. 2002;420(6914):414–418. pmid:12459783
  158. 158. Wierenga CJ, Ibata K, Turrigiano GG. Postsynaptic Expression of Homeostatic Plasticity at Neocortical Synapses. Journal of Neuroscience. 2005;25(11):2895–2905. pmid:15772349
  159. 159. Jakawich SK, Nasser HB, Strong MJ, McCartney AJ, Perez AS, Rakesh N, et al. Local Presynaptic Activity Gates Homeostatic Changes in Presynaptic Function Driven by Dendritic BDNF Synthesis. Neuron. 2010;68(6):1143–1158. pmid:21172615
  160. 160. Laviv T, Vertkin I, Berdichevsky Y, Fogel H, Riven I, Bettler B, et al. Compartmentalization of the GABAB receptor signaling complex is required for presynaptic inhibition at hippocampal synapses. J Neurosci. 2011;31(35):12523–12532. pmid:21880914
  161. 161. Lee S, Hjerling-Leffler J, Zagha E, Fishell G, Rudy B. The largest group of superficial neocortical GABAergic interneurons expresses ionotropic serotonin receptors. J Neurosci. 2010;30(50):16796–16808. pmid:21159951
  162. 162. Mitra A, Mitra SS, Tsien RW. Heterogeneous reallocation of presynaptic efficacy in recurrent excitatory circuits adapting to inactivity. Nat Neurosci. 2011;15(2):250–257. pmid:22179109
  163. 163. Murthy VN, Schikorski T, Stevens CF, Zhu Y. Inactivity produces increases in neurotransmitter release and synapse size. Neuron. 2001;32(4):673–682. pmid:11719207
  164. 164. Lee JS, Ho WK, Neher E, Lee SH. Superpriming of synaptic vesicles after their recruitment to the readily releasable pool. Proc Natl Acad Sci USA. 2013;110(37):15079–15084. pmid:23980146
  165. 165. Desai NS, Rutherford LC, Turrigiano GG. Plasticity in the intrinsic excitability of cortical pyramidal neurons. Nat Neurosci. 1999;2(6):515–520. pmid:10448215
  166. 166. Gibson JR, Bartley AF, Huber KM. Role for the Subthreshold Currents ILeak and IH in the Homeostatic Control of Excitability in Neocortical Somatostatin-Positive Inhibitory Neurons. Journal of Neurophysiology. 2006;96(1):420–432. pmid:16687614
  167. 167. Grubb MS, Burrone J. Activity-dependent relocation of the axon initial segment fine-tunes neuronal excitability. Nature. 2010;465(7301):1070–1074. pmid:20543823
  168. 168. Jang SS, Chung HJ. Emerging Link between Alzheimer’s Disease and Homeostatic Synaptic Plasticity. Neural Plasticity. 2016;2016:7969272.
  169. 169. van Albada SJ, Helias M, Diesmann M. Scalability of asynchronous networks is limited by one-to-one mapping between effective connectivity and correlations. PLOS Comput Biol. 2015;11(9):e1004490. pmid:26325661
  170. 170. Furber S. Large-scale neuromorphic computing systems. Journal of Neural Engineering. 2016;13(5):051001. pmid:27529195
  171. 171. Bos H, Morrison A, Peyser A, Hahne J, Helias M, Kunkel S, et al. NEST 2.10.0. Zenodo. 2015.
  172. 172. Harris KD, Thiele A. Cortical state and attention. Nat Rev Neurosci. 2011;12:509–523. pmid:21829219
  173. 173. Tetzlaff T, Helias M, Einevoll GT, Diesmann M. Decorrelation of Neural-Network Activity by Inhibitory Feedback. PLOS Comput Biol. 2012;8(8):e1002596. pmid:23133368
  174. 174. Helias M, Tetzlaff T, Diesmann M. Echoes in correlated neural systems. New J Phys. 2013;15:023002.
  175. 175. Rajan K, Abbott LF. Eigenvalue spectra of random matrices for neural networks. Phys Rev Lett. 2006;97(18):188104. pmid:17155583
  176. 176. Nordlie E, Gewaltig MO, Plesser HE. Towards Reproducible Descriptions of Neuronal Network Models. PLOS Comput Biol. 2009;5(8):e1000456. pmid:19662159