
Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks

Abstract

How the connectivity of cortical networks determines the neural dynamics and the resulting computations is one of the key questions in neuroscience. Previous works have pursued two complementary approaches to quantify the structure in connectivity. One approach starts from the perspective of biological experiments where only the local statistics of connectivity motifs between small groups of neurons are accessible. Another approach is based instead on the perspective of artificial neural networks where the global connectivity matrix is known, and in particular its low-rank structure can be used to determine the resulting low-dimensional dynamics. A direct relationship between these two approaches is however currently missing. Specifically, it remains to be clarified how local connectivity statistics and the global low-rank connectivity structure are inter-related and shape the low-dimensional activity. To bridge this gap, here we develop a method for mapping local connectivity statistics onto an approximate global low-rank structure. Our method rests on approximating the global connectivity matrix using dominant eigenvectors, which we compute using perturbation theory for random matrices. We demonstrate that multi-population networks defined from local connectivity statistics for which the central limit theorem holds can be approximated by low-rank connectivity with Gaussian-mixture statistics. We specifically apply this method to excitatory-inhibitory networks with reciprocal motifs, and show that it yields reliable predictions for both the low-dimensional dynamics and the statistics of population activity. Importantly, it analytically accounts for the activity heterogeneity of individual neurons in specific realizations of local connectivity. Altogether, our approach allows us to disentangle the effects of mean connectivity and reciprocal motifs on the global recurrent feedback, and provides an intuitive picture of how local connectivity shapes global network dynamics.

Author summary

The structure of connections between neurons is believed to determine how cortical networks control behaviour. Current experimental methods typically measure connections between small numbers of simultaneously recorded neurons, and thereby provide information on statistics of local connectivity motifs. Collective network dynamics are however determined by network-wide patterns of connections. How these global patterns are related to local connectivity statistics and shape the dynamics is an open question that we address in this study. Starting from networks defined in terms of local statistics, we develop a method for approximating the resulting connectivity by global low-rank patterns. We apply this method to classical excitatory-inhibitory networks and show that it allows us to predict both collective and single-neuron activity. More generally, our approach provides a link between local connectivity statistics and global network dynamics.

Introduction

One of the central questions in neuroscience is how the connectivity structure of cortical networks determines the collective dynamics of neural activity and their function. Experimental assessments of connectivity are typically based on measurements of synaptic weights between small numbers of neurons recorded simultaneously [1–9]. The most common approach to quantify connectivity therefore focuses on local statistics, and starts by characterizing the connection probability between pairs of neurons based on their type, before considering progressively more complex connectivity motifs. Linking these local connectivity statistics to the emerging network dynamics has been an active topic of investigation [10–24]. A second approach, motivated by computational network models rather than experimental measurements [25–29], instead specifies the connectivity in terms of a low-rank structure defined by network-wide patterns of connectivity [30–39]. This global connectivity structure directly determines the low-dimensional dynamics and the resulting computations [30, 31, 33], yet it remains unclear how it is related to local connectivity statistics that can be recorded experimentally. In this study, we aim to bridge this gap by mapping local connectivity statistics onto a global, low-rank description of connectivity and comparing the resulting dynamics.

Starting from random networks with connectivity defined in terms of local, cell-type dependent statistics, we develop a low-rank approximation based on the dominant eigenmodes of the connectivity matrix. Using perturbation theory, we show that the obtained low-rank connectivity patterns universally obey Gaussian-mixture statistics and therefore lead to analytically tractable dynamics [31, 33]. We specifically apply this approach to excitatory-inhibitory networks with connections consisting of independent and reciprocal parts, and exploit the low-rank approximation to predict the emerging dynamics.

We first show that, although the dominant low-rank structure is set on average by the mean synaptic weights [40–42], a perturbative approach accurately accounts for the components of individual neurons on the dominant eigenvectors arising from individual instances of the random connectivity. As a result, our low-rank approximation analytically captures the activity of individual neurons in the original E-I network defined based on local statistics. The analytic description of the dynamics in the low-rank approximation moreover leads to the identification of two distinct sources of recurrent feedback, corresponding respectively to the mean connectivity and to reciprocal connections between neurons. In particular, reciprocal motifs impact the dynamics by modulating both the dominant eigenvalue and the corresponding eigenvectors, and can give rise to additional bistability in the network. Altogether, our analytical mapping of the local E-I statistics to a low-rank description provides a quantitative and intuitive picture of how local connectivity statistics determine global low-dimensional dynamics.

The paper is structured as follows. In Results Secs. 1.1 and 1.2 we explain our setup for the local and global representations of random connectivity, and introduce the low-rank approximation, the matrix determinant lemma and the perturbation theory that form the basis of our analytical approach. Starting from the case of Gaussian excitatory-inhibitory networks, Results Secs. 1.3 and 1.4 demonstrate our analysis for approximating local connectivity with low-rank structure. In Results Sec. 1.5, we use the resulting low-rank approximations to describe the nonlinear dynamics. Then, in Results Sec. 1.6, we generalize our analysis to sparse excitatory-inhibitory networks. Major novel findings, key insights, limitations and future extensions are addressed in the Discussion. Technical details and a review of the Gaussian-mixture low-rank framework are covered in Methods, preceded by a table of notations (Table 1).

1 Results

1.1 Local vs. global representations of random recurrent connectivity

We study networks of N rate units with random recurrent connectivity given by the connectivity matrix J, where the entry Jij corresponds to the strength of the synapse from neuron j to neuron i. A full statistical description of the random connectivity would require specifying the joint distribution P({Jij}) of the N² synaptic weights. Determining the dynamics from this high-dimensional distribution is however in general intractable. We therefore focus on connectivity models that make simplifying assumptions on the underlying statistics.

Our specific goal is to relate two different classes of such models, which we refer to as the local and the global representations of recurrent connectivity. Both representations assume that the network consists of P populations, and the statistics of connectivity depend only on the pre- and post-synaptic populations. The two representations are however based on different statistical features of the connectivity.

The local representation defines the connectivity statistics by starting from the marginal distributions Prob(Jij = J) of individual synaptic weights, and by including progressively higher-order correlations referred to as connectivity motifs [1, 10, 14]. In this work, we will consider only the first two orders, i.e. the distribution of individual weights and the pairwise correlations ηij between reciprocal connections Jij and Jji that quantify pairwise motifs (Fig 1A). Our key assumption is that both the marginal distributions of Jij and the correlations ηij depend only on the populations p and q that the post- and pre-synaptic neurons belong to:

$$\mathrm{Prob}(J_{ij} = J) = f_{pq}(J), \qquad \mathrm{Corr}(J_{ij}, J_{ji}) = \eta_{pq}, \qquad \text{for } i \in p,\ j \in q. \tag{1}$$

All synapses connecting the same two populations therefore have identical statistics, leading to a block-like statistical structure for the connectivity matrix J (Fig 1B left panel).
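To make this block structure concrete, the following minimal sketch (in Python; not the authors' code) samples a connectivity matrix with such local statistics in the independent Gaussian case. The normalization of the mean weights (JE/NE for excitatory and −JI/NI for inhibitory presynaptic neurons) is an assumption, chosen to be consistent with the eigenvalue λ0 = JE − JI used below.

```python
import numpy as np

rng = np.random.default_rng(0)
N_E, N_I = 960, 240                      # 4:1 E/I ratio, as in the figures
N = N_E + N_I
J_E, J_I, g = 2.0, 0.6, 0.3              # mean couplings, disorder strength

# Mean part: depends only on the presynaptic population (the column).
mean_col = np.concatenate([np.full(N_E, J_E / N_E), np.full(N_I, -J_I / N_I)])
J_mean = np.tile(mean_col, (N, 1))       # identical rows -> rank one

# Zero-mean random part Z with element-wise variance g^2 / N.
Z = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

J = J_mean + Z                           # the decomposition of Eq (3) below
print(np.linalg.matrix_rank(J_mean))     # -> 1
```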

Fig 1. Local vs global representations of recurrent connectivity.

(A) The local representation defines the statistics of synaptic weights Jij by starting from the marginal probability distribution of individual synaptic weights (left) and then specifying reciprocal motifs in terms of correlations ηij between reciprocal weights Jij and Jji connecting neurons i and j (right). Both the marginal distribution and the reciprocal correlations are assumed to depend only on the populations p and q that the neurons i and j belong to. (B) The resulting connectivity matrix J has block-structured statistics, where different blocks correspond to connections between the P different populations (P = 2 in this illustration). It can be decomposed into a superposition of a mean component $\bar{J}$ and a remaining zero-mean random connectivity component Z that has block-structured variances. (C) The global, low-rank representation defines the connectivity matrix J as the sum of R outer products between connectivity vectors m(r), n(r) for r = 1…R. The statistics of connectivity are defined in terms of the joint probability distribution over neurons i of their entries on connectivity vectors. We specifically consider the class of Gaussian-mixture low-rank models, where each neuron is first assigned to a population p, and within each population the entries on connectivity vectors are generated from a multivariate Gaussian distribution with fixed statistics. Here we illustrate this distribution for one pair of connectivity vectors (R = 1) and P = 2 populations. Each dot represents the connectivity parameters of one neuron i, the red and blue colours denote the two populations, white dots and the rotations of the dot clouds indicate the mean and covariance of the distribution for each population. (D) Relating the local and global representations of recurrent connectivity for a simplified excitatory-inhibitory network. In this model, the mean connectivity depends only on the presynaptic population (indicated by red and blue colours). The mean connectivity is in this case rank-one, and can be written as an outer product of vectors $\bar{m}$ and $\bar{n}$. We approximate the full connectivity by a rank-one matrix, with connectivity vectors m and n obtained from $\bar{m}$ and $\bar{n}$ using perturbation theory.

https://doi.org/10.1371/journal.pcbi.1010855.g001

The global representation of connectivity instead refers to the situation where J is defined as a low-rank matrix [30, 31, 33]:

$$J = \sum_{r=1}^{R} \frac{m^{(r)}\, {n^{(r)}}^{\mathsf{T}}}{N}. \tag{2}$$

Here m(r) and n(r) for r = 1…R are referred to as connectivity vectors, where R is the rank of J. In this representation, the statistics of connectivity are defined by the distribution of vector elements, rather than directly by the distribution of synaptic weights as in the local representation. Specifically, each neuron i is characterized by its set of entries over the connectivity vectors. For each neuron, these 2R entries are generated from a joint distribution, independently of the other neurons, and the parameters of this joint distribution depend on the population p the neuron belongs to. Here we focus on the broad class of Gaussian-mixture low-rank networks, in which for population p, the joint distribution of elements is a multivariate Gaussian defined by the means and covariances of the 2R entries [31, 33] (Fig 1C).

To relate the local and the global representations of connectivity, a key observation is that any matrix J generated from the local statistics defined in Eq (1) can be expressed as

$$J = \bar{J} + Z, \tag{3}$$

where $\bar{J}$ contains the mean values of the connections, and Z contains the remaining, zero-mean random part [40]. Because of the underlying population structure (Eq (1)), $\bar{J}$ consists of P × P blocks with identical values within each block (Fig 1B middle panel), and is therefore at most of rank P. The random part Z is instead in general of rank N, but obeys block-like statistics, with variance and normalized covariance parameters defined by P × P matrices (Methods Secs. 2.1.1–2.1.2, Eqs (27) and (32)).

For the sake of simplicity, in this study, we focus on a simplified excitatory-inhibitory model [41, 43]. This network consists of one excitatory and one inhibitory population, so P = 2 and in the following we use the population indices p, q = E, I. A central simplifying assumption in this model is that the mean synaptic weights depend only on the pre-synaptic population, so that $\bar{J}_{ij} = J_E/N_E$ for excitatory and $\bar{J}_{ij} = -J_I/N_I$ for inhibitory presynaptic neurons j. The mean connectivity matrix $\bar{J}$ therefore consists of only two blocks and is unit rank (Fig 1D). The statistics of the random part Z instead depend on both pre- and post-synaptic populations, and are therefore described by 2 × 2 matrices of variance and normalized covariance parameters (see Methods Sec. 2.1.3, Eq (37)).

1.2 Approximating locally-defined connectivity with low-rank connectivity

To relate the local and global representations of connectivity, we start from a connectivity matrix J generated from the local statistics (Eq (1)) and approximate it by a rank-R matrix of the form given in Eq (2). As the locally-defined connectivity matrix J is in general of full rank N, this is equivalent to the classical low-rank approximation problem, for which a variety of methods exist [30, 31, 33]. Here we use a simple truncated eigendecomposition, as it preserves the dominant eigenvalues that determine the nonlinear dynamics.

Applying the standard eigenmode decomposition, J can in general be factored as

$$J = \sum_{r=1}^{N} \frac{m^{(r)}\, {n^{(r)}}^{\mathsf{T}}}{N}, \tag{4}$$

where m(r) and n(r) are rescaled versions of the r-th right and left eigenvectors (Methods Sec. 2.3, Eqs (44)–(49)), ordered by the absolute value of their eigenvalue λr for r = 1…N. A rank-R approximation that preserves the top R eigenvalues can then be obtained by simply keeping the first R terms in the sum in Eq (4). In this study, we focus on R = 1, corresponding to the dominant eigenvalue. Higher-rank approximations will be described elsewhere.
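As an illustration, such a truncation can be computed numerically. The sketch below keeps the R eigenvalues of largest modulus; the rescaling convention (absorbing the eigenvalue into the left vector) is one possible choice, not necessarily the one used in Methods.

```python
import numpy as np

def low_rank_approx(J, R=1):
    """Rank-R approximation of J preserving its R largest-modulus eigenvalues."""
    eigvals, V = np.linalg.eig(J)        # columns of V: right eigenvectors
    W = np.linalg.inv(V)                 # rows of W: left eigenvectors
    idx = np.argsort(-np.abs(eigvals))[:R]
    # sum over the kept modes of lambda_r * (right_r outer left_r)
    return (V[:, idx] * eigvals[idx]) @ W[idx, :]
```

For a matrix with a single real outlier, the result is real up to numerical error and can be cleaned with np.real_if_close.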

Eigenvalues and eigenvectors are in general complex nonlinear functions of the entries of the matrix J. To determine the dominant eigenvalues and the corresponding vectors of J, we capitalize on the observation in Eq (3) that a locally-defined connectivity matrix can in general be expressed as a sum of a low-rank matrix of mean values and a remaining random part Z. Previous studies have found that the eigenspectra of matrices with such structure typically consist of two components in the complex plane: a continuously-distributed bulk determined by the random part, and discrete outliers controlled by the low-rank structure [30, 32, 39, 44–46]. In this study, we extend previous approaches to determine the influence of the block-like statistics of Z on the outliers that correspond to dominant eigenvalues. We then use perturbation theory to determine the corresponding left and right eigenvectors and their statistical structure. Here we summarize the main steps of this analysis (full details are provided in Methods), and then apply it to specific cases in the following sections.

We focus on the simplified E-I network for which the mean part of the connectivity is unit rank and can therefore be written as $\bar{J} = \bar{m}\,\bar{n}^{\mathsf{T}}/N$, so that the full connectivity matrix is

$$J = \frac{\bar{m}\,\bar{n}^{\mathsf{T}}}{N} + Z. \tag{5}$$

The mean part of the connectivity has a unique non-trivial eigenvalue λ0, which can give rise to one or several outliers λ in the eigenspectrum of J. To determine how the random part of the connectivity influences λ, we start from the characteristic equation for the eigenvalues of J and exploit Eq (5) to apply the matrix determinant lemma (Eq (59)) and get

$$\frac{1}{N}\,\bar{n}^{\mathsf{T}} \left(I - \frac{Z}{\lambda}\right)^{-1} \bar{m} = \lambda. \tag{6}$$

As long as (I − Z/λ) is invertible, this equation determines the outlying eigenvalues λ of J.

We next assume that the maximal eigenvalue of Z is smaller than the outlying eigenvalues corresponding to the low-rank approximation we aim to determine (Eq (4), Methods Sec. 2.2 Eqs (40) and (41)). This condition holds as long as the variance amplitudes of the random part Z of the connectivity are not too large [15, 18] (S4 Text). This assumption allows us to perform a series expansion, which leads to a nonlinear equation for λ [32]:

$$\lambda = \sum_{k=0}^{\infty} \frac{\theta_k}{\lambda^{k}}, \qquad \theta_k = \frac{\bar{n}^{\mathsf{T}} Z^{k}\, \bar{m}}{N}. \tag{7}$$

Although this nonlinear equation involves an infinite series, it admits only a finite number of solutions (at most N) for the eigenvalue outliers [32]. More specifically, in this work we focus on second-order reciprocal motifs in the random component. The second-order coefficient θ2 in Eq (7) is the first non-trivial term in the reciprocal case, so we truncate the series at k = 2 (included) to obtain a transparent picture of how reciprocal motifs modify the eigenvalue outlier. In addition, we provide the computation of eigenvalue outliers without truncation, together with more detailed explanations, in S6 Text.

Truncating the sum at second order yields an approximate third-order polynomial for λ:

$$\lambda^{3} = \theta_0\,\lambda^{2} + \theta_1\,\lambda + \theta_2. \tag{8}$$

The statistics of the outlying eigenvalue can then be obtained by averaging over the random part of the connectivity Z.
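In practice, the truncated prediction amounts to computing θ0, θ1, θ2 for a given realization of Z and solving the cubic. A sketch, under the series form assumed in Eq (7):

```python
import numpy as np

def outlier_prediction(Z, mbar, nbar):
    """Roots of the truncated outlier equation, Eq (8)."""
    N = len(mbar)
    theta0 = nbar @ mbar / N             # = lambda_0
    theta1 = nbar @ Z @ mbar / N
    theta2 = nbar @ Z @ Z @ mbar / N
    # lambda^3 - theta0*lambda^2 - theta1*lambda - theta2 = 0
    return np.roots([1.0, -theta0, -theta1, -theta2])
```

Roots whose modulus falls inside the eigenvalue bulk are discarded when comparing with the observed outliers.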

An approximate expression for the right and left connectivity vectors m and n of J corresponding to the outliers λ can be determined using first-order perturbation theory [47], where we treat the variance amplitudes of the random part Z of the connectivity as small parameters. We first note that $\bar{m}$ and $\bar{n}$ are the right- and left-eigenvectors of $\bar{J}$ corresponding to the non-trivial eigenvalue λ0. Interpreting the full connectivity matrix J as $\bar{J}$ perturbed by the random matrix Z, at first order m and n can be expressed as

$$m = \bar{m} + \Delta m, \qquad n = \bar{n} + \Delta n, \tag{9}$$

with

$$\Delta m = \frac{Z\,\bar{m}}{\lambda_0}, \qquad \Delta n = \frac{Z^{\mathsf{T}}\,\bar{n}}{\lambda_0}. \tag{10}$$

A key observation is that each element of Δm and Δn is a sum of N random variables. If the elements {zij} in Z are random samples drawn from a distribution with zero mean and finite variance, the central limit theorem holds and therefore predicts that, in the limit of large N, the statistics of Δmi and Δni, and therefore mi and ni, follow a Gaussian distribution. In general, the mean and variance of mi and ni and their correlation are determined by the mean, variance and correlation of the elements of J, but not by the specific form of the probability distribution. Since the matrix Z has block-like statistics determined by the population structure, the statistics of the resulting mi and ni depend on the population p the neuron i belongs to. Overall, the distributions of elements of m and n obtained from perturbation theory therefore follow a Gaussian-mixture distribution, so that our approach effectively approximates a locally-defined J by a Gaussian-mixture low-rank model specified by the means $\bar{m}_p$, $\bar{n}_p$, the variances $\sigma^2_{m_p}$, $\sigma^2_{n_p}$ and the covariances $\sigma^{(p)}_{nm}$ of the entries on the connectivity vectors for p = E, I.
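The first-order corrections can be evaluated directly from a sampled Z. The sketch below continues the first sketch above (reusing N, N_E, mean_col, Z, J_E, J_I) and uses the first-order form of Eq (10) as assumed there; it also estimates the per-population Gaussian-mixture statistics.

```python
import numpy as np

def perturbed_eigenvectors(Z, mbar, nbar, lam0):
    """First-order approximation of the dominant eigenvectors, Eqs (9)-(10)."""
    dm = Z @ mbar / lam0     # each entry sums N random terms, hence
    dn = Z.T @ nbar / lam0   # Gaussian statistics by the central limit theorem
    return mbar + dm, nbar + dn

mbar = np.ones(N)
nbar = N * mean_col                       # J_E/alpha_E and -J_I/alpha_I entries
m, n = perturbed_eigenvectors(Z, mbar, nbar, J_E - J_I)
for sl, name in [(slice(0, N_E), "E"), (slice(N_E, None), "I")]:
    print(name, m[sl].mean(), m[sl].var(), np.cov(m[sl], n[sl])[0, 1])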

We next apply the perturbative approach described here to networks with independent random components, and then to networks with reciprocal motifs.

1.3 Low-rank structure induced by independently generated synaptic connections

We first apply our approach for a low-rank approximation to the simplest version of the locally-defined excitatory-inhibitory network, where each Jij is generated independently from a Gaussian distribution with variance g²/N and a mean that depends only on the pre-synaptic population q ∈ {E, I}, i.e. $\bar{J}_{ij} = J_E/N_E$ for j ∈ E and $\bar{J}_{ij} = -J_I/N_I$ for j ∈ I. The entries of the eigenvectors $\bar{m}$ and $\bar{n}$ of the mean connectivity matrix $\bar{J}$, and its non-trivial eigenvalue, are then given by:

$$\bar{m}_i = 1, \tag{11}$$

$$\bar{n}_i = \begin{cases} J_E/\alpha_E & i \in E \\ -J_I/\alpha_I & i \in I \end{cases}, \tag{12}$$

$$\lambda_0 = J_E - J_I, \tag{13}$$

where αp = Np/N denotes the fraction of neurons in population p.

In the large network limit, averaging over Z in Eq (7) yields ⟨θk⟩ = 0 for all k > 0 [32], so that the outlier is on average given by ⟨λ⟩ = θ0 = λ0 (Fig 2A). Our approach moreover gives an expression for the standard deviation of the outlier in the finite-size network, which grows linearly with g (Fig 2B, Eq (80)). Examining the entries of the left and right eigenvectors n and m of J corresponding to the outlier, we find that, as expected, the distribution of (mi, ni) is well described by a mixture of two Gaussians centred at ($\bar{m}_p$, $\bar{n}_p$) for p = E, I (Fig 2C). We further find that perturbation theory accurately accounts for the individual entries of the eigenvectors as long as g is sufficiently below unity (Fig 2D). Perturbation theory provides a lower bound for the values of the corresponding variances. For large values of g, the distributions remain Gaussian, but their variances increase above the predictions of perturbation theory (Fig 2E). The reason for this deviation is that perturbation theory assumes a small value of the scaling factor g in the variance of Jij; as a result, the deviation also reflects the systematic error resulting from the first-order approximation. Importantly, the entries of the left and right eigenvectors are uncorrelated, and only their means, but not their variances, differ between the two populations.

Fig 2. Eigenvalues and dominant eigenvectors for locally-defined Gaussian connectivity with independent synaptic weights.

(A) Eigenvalue spectra of excitatory-inhibitory connectivity matrices J with elements generated from Gaussian distributions with identical variances g²/N over neurons. The coloured dots in the circular bulk show 600 eigenvalues for one realization of the random connectivity for each value of g. Different colours correspond to different values of g. Dashed envelopes indicate the theoretical predictions for the radius rg of the circular bulk computed according to Eqs. (147), (148). Outlying eigenvalues are shown for 30 realizations of the random connectivity, and for different g their location on the y-axis is shifted to help visualization; the dispersion reflects finite-size effects. The red arrow points to the eigenvalue λ0 of the mean connectivity matrix $\bar{J}$. (B) Statistics of outlying eigenvalues over realizations of random connectivity. Empirical distribution (the red area shows mean ± standard deviation and reflects finite-size effects), compared with the theoretical predictions for the mean (black dashed line) and standard deviation (gray dashed line) obtained using Eq (80). (C) Scatter plot showing for each neuron i its entry ni on the left eigenvector against its entry mi on the right eigenvector. Red and blue colours represent respectively excitatory and inhibitory neurons. The white dots and the dashed lines respectively indicate the means and covariances for each population obtained from simulations. (D) Comparison between eigenvector entries obtained from direct eigen-decomposition of J with projections obtained using perturbation theory (Eqs (9) and (10)) in a given realization of Z. (E) Comparison between simulations (dashed lines) and theory (full lines) for the variances $\sigma^2_{m_p}$, $\sigma^2_{n_p}$ of eigenvector entries corresponding to different populations (Eq (90)). To help visualization, the curves for the excitatory population are thicker than the curves for the inhibitory population. (F-J) Identical quantities for connectivity matrices in which the variance parameters are heterogeneous: gEE : gEI : gIE : gII = 1.0 : 0.5 : 0.2 : 0.8, gEE increases from 0 to 1. Other network parameters NE = 4NI = 1200 and JE = 2.0, JI = 0.6 in all simulations.

https://doi.org/10.1371/journal.pcbi.1010855.g002

We next turn to the case where the variances of synaptic weights depend on the pre- and post-synaptic populations q, p, and are given by $g^2_{pq}/N$. In that case, the entries of the random part of the connectivity Z are independent, but not identically distributed Gaussians. Previous studies [11, 44] have shown that the spectrum of Z remains circularly symmetric, but its radius rg is determined by a combination of the variance parameters gpq (S4 Text, Eqs. (147), (148)). Examining the resulting connectivity matrix J, we found that the results for the uniform case directly extend to this heterogeneous situation. The eigenspectrum of J still consists of an independent superposition of the spectra of Z and $\bar{J}$ (Fig 2F). In particular, the random part of the connectivity does not modify the average value of the outlier, but only impacts its variance, which now depends on a combination of the variances gpq (Fig 2G, Eq (90)). Similarly to the uniform case, the distribution of the entries of the left and right eigenvectors is well described by a mixture of two Gaussians, with variances predicted by perturbation theory. The entries of the left and right eigenvectors are uncorrelated, but now both their means and variances depend on the population the neuron belongs to (Fig 2H–2J).

In summary, when synaptic connections Jij are generated independently across pairs of neurons, the equivalent global representation is a Gaussian-mixture low-rank model in which the entries of the structure vectors are independent, with mean values determined by the low-rank structure of the mean connectivity matrix $\bar{J}$. Although the mean connectivity establishes the network's basic structure, the Gaussian-mixture low-rank approximation improves on it by preserving connectivity fluctuations and reflecting the cell-type-dependent variances of the original locally-defined random part Z. Importantly, in that situation, the dominant outlying eigenvalues of J are on average identical to those of $\bar{J}$, that is, ⟨λ⟩ = λ0.

1.4 Low-rank structure induced by reciprocal motifs

We next turn to locally-defined excitatory-inhibitory networks with reciprocal connectivity motifs quantified by the correlation ηij between reciprocal synaptic weights Jij and Jji. We assumed that these reciprocal correlations are identical for any pair of neurons i and j belonging to a given pair of populations p and q, and used the corresponding parameters ηpq to generate the connectivity matrix J (Methods Sec. 2.1.2). Within the decomposition of J into a mean part $\bar{J}$ and a random part Z (Eq (24)), the additional reciprocal correlations affect only the statistics of Z.
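For the homogeneous case, one standard construction (used here as an illustrative sketch; the paper's exact generation procedure is given in Methods Sec. 2.1.2) obtains a reciprocal correlation η by mixing symmetric and antisymmetric Gaussian matrices:

```python
import numpy as np

def reciprocal_gaussian(N, g, eta, rng):
    """Zero-mean Gaussian Z with Var(z_ij) = g^2/N and Corr(z_ij, z_ji) = eta."""
    X1, X2 = rng.normal(size=(2, N, N))
    S = (X1 + X1.T) / np.sqrt(2)          # symmetric part, corr(S_ij, S_ji) = +1
    A = (X2 - X2.T) / np.sqrt(2)          # antisymmetric part, corr = -1
    Z = np.sqrt((1 + eta) / 2) * S + np.sqrt((1 - eta) / 2) * A
    np.fill_diagonal(Z, 0.0)
    return Z * g / np.sqrt(N)

rng = np.random.default_rng(1)
Z = reciprocal_gaussian(1200, g=0.3, eta=0.8, rng=rng)
i, j = np.triu_indices(1200, k=1)
print(np.corrcoef(Z[i, j], Z[j, i])[0, 1])   # close to 0.8
```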

We first consider the homogeneous case where the reciprocal correlation is identical across all populations, i.e. ηpq = η (Fig 3). Previous studies have shown that a random matrix Z with zero mean and reciprocal correlations η has a continuous spectrum that is deformed from a circle into an ellipse as η is increased [18, 48]. Superpositions of correlated random matrices and low-rank structure such as $\bar{J}$ have, to our knowledge, not been previously studied. Inspecting the eigenspectrum of $J = \bar{J} + Z$, we find that it still consists of a continuous bulk and discrete outliers (Fig 3A). The continuous bulk is contained in an ellipse in the complex plane identical to the spectrum of Z, as in the uncorrelated case. In contrast, we find that the outliers deviate from the eigenvalues of $\bar{J}$ as η is increased (Fig 3B). These deviations are well captured by our analytic approach summarized in Eq (8). Indeed, when averaging Eq (8) over Z, reciprocal correlations generate a non-zero ⟨θ2⟩ due to the Z² term. This term leads to a cubic equation in Eq (8) and therefore has two effects. First, the non-zero θ2 induces deviations of the outliers from the eigenvalue λ0 of $\bar{J}$. The direction of these deviations is positive if excitation dominates (λ0 = JE − JI > 0) and negative if inhibition dominates (λ0 = JE − JI < 0, Fig 3G–3I). Second, the cubic equation can have up to three solutions and therefore potentially generates additional outliers, and in particular complex conjugate ones. Whether these additional outliers are observed depends on the accuracy of the third-order approximation (Eq (8)) to the determinant lemma (Eq (59)), and on the norm of these outliers compared to the spectral radius, both in the scenario where the network has a homogeneous variance g²/N (Fig 3A and 3G) and when the gpq differ between populations (Fig 3J). See S6 Text for a more thorough discussion.

Fig 3. Eigenvalues and dominant eigenvectors for locally-defined Gaussian connectivity with homogeneous reciprocal correlations.

(A) Eigenvalue spectra of excitatory-inhibitory connectivity matrices J, with homogeneous reciprocal correlations η. Different colours from top to bottom correspond to networks with different values of η. The dots in the elliptical bulk show 600 eigenvalues for one realization of the random connectivity. Outlying eigenvalues are shown for 30 realizations of the random connectivity; the dispersion reflects finite-size effects. The red arrow on the top points to the eigenvalue λ0 of the mean connectivity $\bar{J}$. Coloured circles are the eigenvalues predicted using the determinant lemma and truncated series expansion (Eqs (7) and (8), truncating at k = 2), coloured triangles are the eigenvalues predicted using the determinant lemma without finite truncation (see S6 Text, Eqs. (153)-(157)). (B) Comparison of the eigenvalues from the finite-size simulation with the predictions of the determinant lemma as the reciprocal correlation η is increased. The coloured solid lines show the roots of the third-order polynomial in Eq (8) (truncated series expansion). The light purple area indicates the empirical distribution of the dominant outlier for 30 realizations, reflecting finite-size effects, while the black dashed line is the unperturbed eigenvalue λ0. The grey areas represent the areas covered by the eigenvalue bulk. (C) Scatter plot showing for each neuron i its entry ni on the left eigenvector against its entry mi on the right eigenvector. Red and blue colours represent respectively excitatory and inhibitory neurons. The white dots and the dashed lines respectively indicate the means and covariances for each population. (D) Comparison between eigenvector entries obtained from direct eigen-decomposition of J with projections obtained using perturbation theory (Eqs (9) and (10)) in a given realization of Z. (E) Comparison between simulations (coloured areas, finite-size effects) and theoretical predictions (coloured lines, Eq (97)) for the population covariance of the entries on the left and right connectivity eigenvectors for the different populations. (F) Comparison of the overall covariance σnm (Eq (72)) with the deviation Δλ of the dominant outlying eigenvalue from the unperturbed value λ0. Empirical covariance (gradient blue area reflects finite-size effects, where the colour depth represents η) compared with the theoretical prediction (black line) obtained using Eqs (97) and (92). The x-axis uses the theoretical prediction of the deviation of the eigenvalue λ from λ0. Other network parameters: JE = 2.0, JI = 1.2, NE = 4NI = 1200 and homogeneous variance parameters gpq = g = 0.3. (G-I) Same as (A, B, F) for an inhibition-dominated connectivity matrix where JI = 2.0, JE = 1.2, with homogeneous reciprocity η and variance parameter g = 0.3. (J-M) Same as (A-C) and (E) for excitatory-inhibitory connectivity matrices with homogeneous reciprocal correlations η but cell-type-dependent variance parameters gEE : gEI : gIE : gII = 1.0 : 0.5 : 0.2 : 0.8 and gEE = 0.3. In (B, E, F, H, I, K, M), the departure of the centres of the numerical simulation results from the theoretical predictions reflects the systematic errors due to the first-order perturbation approximation of eigenvalues and eigenvectors.

https://doi.org/10.1371/journal.pcbi.1010855.g003

We next examine the right- and left-eigenvectors m and n corresponding to the dominant outlier. As in the networks with independent connections, perturbation theory accounts for the individual entries of these vectors in a specific realization of the locally-defined random connectivity Z, and these individual entries mi and ni exhibit Gaussian-mixture statistics (Fig 3C and 3D, Eq (10), and Methods Sec. 2.3 Eqs (69) and (70)). Unlike in the uncorrelated case, reciprocal correlations now induce correlations between Δmi and Δni (Methods Sec. 2.5.2). Indeed, perturbation theory predicts that the first-order effects Δm and Δn of the random connectivity on m and n are respectively determined by Z and its transpose $Z^{\mathsf{T}}$ (Eq (10)). Reciprocal correlations between zij and zji directly lead to correlations between Z and $Z^{\mathsf{T}}$, and therefore to a non-zero covariance σnm between elements of m and n, which can be predicted by mean-field theory (Fig 3E, Eq (72)). The strength of the covariance between eigenvector entries reflects the strength of the additional feedback loop due to reciprocal correlations, and is therefore directly related to the deviations of the outlying eigenvalue from the uncorrelated value λ0 (Eqs (42) and (96), Fig 3F). When the network has both homogeneous variance parameters and homogeneous correlation parameters, the excitatory and inhibitory populations have the same covariance (Eq (97)). If the synaptic variances gpq differ across populations, the covariances σnm are different for the excitatory and inhibitory populations even if the reciprocal correlations are uniform (Fig 3L and 3M).

These results directly extend to networks with heterogeneous reciprocal correlations ηpq, p, q = E, I (Fig 4). Finite-size simulations in this setting show the existence of additional, complex conjugate outliers accurately predicted by the cubic term in Eq (8) (Fig 4G, where coloured circles and triangles overlap, with outliers scattered at conjugate positions); the prediction is particularly accurate when the impact of higher-order structures is marginal compared to that of structures up to second order. Moreover, the covariances between the entries of the low-rank connectivity vectors in this case differ between the excitatory and inhibitory populations.

Fig 4. Eigenvalues and dominant eigenvectors for network connectivity matrix with heterogeneous reciprocal correlations.

(A) Eigenvalue spectra of excitatory-inhibitory connectivity matrices J, with homogeneous variance parameter g = 0.3 but cell-type-dependent reciprocal correlations ηEE = ηEI = −ηII > 0, and ηEE increasing from 0 to 1. Different colours from top to bottom correspond to networks with different values of ηEE. The dots in the elliptical bulk show 600 eigenvalues for one realization of the random connectivity. Outlying eigenvalues are shown for 30 realizations of the random connectivity; the dispersion reflects finite-size effects. The red arrow on the top points to the eigenvalue λ0 of the mean connectivity $\bar{J}$. Coloured circles are the eigenvalues predicted using the determinant lemma and truncated series expansion (see Eqs (7) and (8), truncating at k = 2), coloured triangles are the eigenvalues predicted using the determinant lemma without finite truncation (see S6 Text, Eqs. (153)-(157)). (B) Comparison of the eigenvalues from the finite-size simulation with the predictions of the determinant lemma as the reciprocal correlation ηEE (−ηII) is increased. The coloured solid lines show the roots of the third-order polynomial in Eq (8) (truncated series expansion). The light purple area indicates the empirical distribution of the dominant outlier for 30 realizations, reflecting finite-size effects; the black dashed line is the unperturbed eigenvalue λ0. The grey areas represent the areas covered by the eigenvalue bulk. (C) Scatter plot showing for each neuron i its entry ni on the left eigenvector against its entry mi on the right eigenvector. Red and blue colours represent respectively excitatory and inhibitory neurons. The white dots and the dashed lines respectively indicate the means and covariances for each population. (D) Comparison between eigenvector entries obtained from direct eigen-decomposition of J with projections obtained using perturbation theory (Eqs (9) and (10)) in a given realization of the random connectivity Z. (E) Comparison between simulations (coloured areas, finite-size effects) and theoretical predictions (coloured lines, Eq (97)) for the population covariance of the entries on the left and right connectivity eigenvectors for the different populations. (F) Comparison of the overall covariance σnm (Eq (72)) with the deviation Δλ of the dominant outlying eigenvalue from the unperturbed value λ0. Empirical covariance (gradient blue area reflects finite-size effects, where the colour depth represents η) compared with the theoretical prediction (black line) obtained using Eqs (97) and (92). The x-axis uses the theoretical prediction of the deviation of the eigenvalue λ from λ0. (G-L) Same as (A-F) for a connectivity matrix with heterogeneous reciprocal correlations: ηEE = −ηEI = −ηII > 0, and ηEE increasing from 0 to 1. In (B, E, F, H, K, L), the departure of the centres of the numerical simulation results from the theoretical predictions reflects the systematic errors due to the first-order perturbation approximation of eigenvalues and eigenvectors. Other network parameters: NE = 4NI = 1200 and homogeneous variance parameters gpq = g = 0.3 in all simulations, JE = 2.0, JI = 1.3 for networks in (A-F), JE = 2.0, JI = 1.4 for networks in (G-L).

https://doi.org/10.1371/journal.pcbi.1010855.g004

1.5 Approximating low-dimensional dynamics for locally-defined connectivity

In previous sections, we developed a rank-one approximation of locally-defined excitatory-inhibitory connectivity. Here we use this approximation to describe the resulting low-dimensional dynamics. We consider networks of rate units, where the activation xi of unit i obeys

$$\frac{dx_i}{dt} = -x_i + \sum_{j=1}^{N} J_{ij}\,\phi(x_j). \tag{14}$$

Here ϕ(x) = 1 + tanh(x − θ) is a positive transfer function, and for simplicity, we focus on autonomous dynamics without external inputs. We start from a locally-defined excitatory-inhibitory connectivity matrix, and compare the resulting activity with the theoretical predictions of our rank-one approximation, for which the dynamics are low-dimensional and analytically tractable. We first summarize the theoretical predictions for those dynamics, and then examine the specific cases of independent and reciprocally-correlated connectivity.
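For reference, the full-network dynamics of Eq (14) can be integrated with a simple forward-Euler scheme; a minimal sketch, using the transfer function and the threshold θ = 1.5 quoted in the figure captions:

```python
import numpy as np

def phi(x, theta=1.5):
    """Positive transfer function from the text: 1 + tanh(x - theta)."""
    return 1.0 + np.tanh(x - theta)

def simulate(J, x0, T=100.0, dt=0.05):
    """Forward-Euler integration of dx/dt = -x + J phi(x), Eq (14)."""
    x = np.array(x0, dtype=float)
    for _ in range(int(T / dt)):
        x += dt * (-x + J @ phi(x))
    return x
```

Running it from different initial conditions x0 probes the coexistence of fixed points discussed below.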

Recent works have shown that in networks with a rank-R connectivity matrix, the trajectories x(t) = {xi(t)}i=1…N are confined to a low-dimensional subspace of the N-dimensional space describing the activity of all units [30–33]. In the absence of external inputs, this subspace is R-dimensional and spanned by the set of connectivity eigenvectors m(r) for r = 1…R, so that the trajectories can be parametrized as $x(t) = \sum_{r=1}^{R} \kappa_r(t)\, m^{(r)}$, where κr is a collective latent variable representing activity along m(r). For a rank-one (R = 1) connectivity corresponding to an approximation of our locally-defined E-I network, the dynamics can therefore be represented by a single latent variable κ, so that the activation of unit i is given by

$$x_i(t) = \kappa(t)\, m_i \tag{15}$$

$$= \kappa(t)\left(\bar{m}_i + \Delta m_i\right), \tag{16}$$

where we inserted the expression for mi obtained from a first-order perturbation (Eqs (9) and (10)). Note that since $\bar{m}_i = 1$, the first term in the r.h.s. of Eq (16) corresponds to the population-averaged mean activity κ(t), while the second term is the deviation of the activity of unit i from the population average, which statistically leads to the population-averaged variance of neuronal activations (Methods Sec. 2.6.1, Eq (111))

$$\sigma^2_{x_p} = \kappa^2\, \sigma^2_{m_p}. \tag{17}$$

By inserting the values of Δmi obtained for specific realizations of the random connectivity Z (Eq (10)) into Eq (16), the rank-one approximation accounts for the heterogeneous firing activity of single units arising from the synaptic weights in specific instances of locally-defined networks. Moreover, the rank-one theory predicts that both the population-averaged mean and the standard deviation of activations in the network are proportional to κ.

The values taken by the latent variable κ can be determined by projecting Eq (14) onto m and inserting Eq (16) (Methods Secs. 2.6.1, 2.6.2). This leads to a closed equation for the dynamics of κ(t):

$$\frac{d\kappa}{dt} = -\kappa + \sum_{p}\alpha_p\left[\bar{n}_p\,\langle\phi\rangle_p + \kappa\,\sigma^{(p)}_{nm}\,\langle\phi'\rangle_p\right], \tag{18}$$

where the brackets denote a Gaussian average (see Eq (112)), αp is the fraction of neurons belonging to population p, and $\sigma^{(p)}_{nm}$ is the population covariance between elements mi and ni with i belonging to population p (Table 1). The steady state then obeys

$$\kappa = F_{\mathrm{mean}}(\kappa) + F_{\mathrm{cov}}(\kappa), \tag{19}$$

where

$$F_{\mathrm{mean}}(\kappa) = \sum_{p}\alpha_p\,\bar{n}_p\,\langle\phi\rangle_p, \qquad F_{\mathrm{cov}}(\kappa) = \kappa\sum_{p}\alpha_p\,\sigma^{(p)}_{nm}\,\langle\phi'\rangle_p. \tag{20}$$

The two terms in the r.h.s. of Eq (19) show that the contributions of recurrent synaptic inputs to the latent dynamics of κ come from two sources: (i) the population means $\bar{m}_p$ and $\bar{n}_p$ of the left and right connectivity eigenvectors, which contribute to Fmean(κ) (Eqs (54) and (56)); (ii) the covariance $\sigma^{(p)}_{nm}$ between the left and right connectivity eigenvectors, which contributes only to Fcov(κ). In the low-rank approximation (Eqs (2) and (41)) of the locally-defined E-I connectivity (Eq (5)), these two terms have distinct origins: the means come from the independent components of the connectivity (Eqs (55) and (56)), while the covariance comes from reciprocal correlations between connections (Eqs (96) and (97)). We next examine separately the effects of these two connectivity components on the dynamics.
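Numerically, the Gaussian averages in Eqs (19) and (20) can be evaluated with Gauss-Hermite quadrature. The sketch below uses the form of Fmean and Fcov assumed above, with illustrative parameter values taken from the figure captions (αE = 0.8, JE = 2.0, JI = 0.6):

```python
import numpy as np

# Gauss-Hermite nodes/weights converted into averages over N(0, 1)
t, w = np.polynomial.hermite.hermgauss(101)
z, w = np.sqrt(2.0) * t, w / np.sqrt(np.pi)

def phi(x, theta=1.5):
    return 1.0 + np.tanh(x - theta)

def dphi(x, theta=1.5):
    return 1.0 / np.cosh(x - theta) ** 2

def rhs(kappa, alpha, nbar, sig_m, sig_nm):
    """F_mean(kappa) + F_cov(kappa) under the form assumed in the text."""
    F = 0.0
    for a, nb, sm, snm in zip(alpha, nbar, sig_m, sig_nm):
        pre = kappa + kappa * sm * z         # x_i = kappa * m_i within pop. p
        F += a * (nb * (w @ phi(pre)) + kappa * snm * (w @ dphi(pre)))
    return F

# fixed points = intersections of rhs(kappa) with the identity line
ks = np.linspace(-1.0, 5.0, 2001)
G = np.array([rhs(k, [0.8, 0.2], [2.5, -3.0], [0.3, 0.3], [0.0, 0.0]) for k in ks])
crossings = ks[np.where(np.diff(np.sign(G - ks)))[0]]
print(crossings)                             # approximate fixed points of kappa
```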

1.5.1 Independently generated local connectivity.

When synaptic connections are generated independently from Gaussian distributions based on the identities of the pre- and post-synaptic populations, the rank-one approximation of connectivity leads to uncorrelated left and right connectivity vectors n and m, so that $\sigma^{(p)}_{nm} = 0$ for p = E, I. In consequence, only the first term is present in the r.h.s. of Eq (19), and the fixed point of the latent dynamics is given by a difference between excitatory and inhibitory feedback (Eq (116)):

$$\kappa = J_E\,\langle\phi\rangle_E - J_I\,\langle\phi\rangle_I. \tag{21}$$

As long as the mean inhibition JI is strong enough to balance the mean excitation JE, Eq (21) predicts a single fixed point. As JE is increased, positive feedback begins to dominate and leads to a bifurcation to a bistable regime for the latent dynamic variable κ (Fig 5A–5C, S2 Text).

Fig 5. Predicting low-dimensional dynamics using a rank-one approximation of networks with independent Gaussian connectivity.

(A) Fixed points of the latent variable κ in the rank-one approximation. The lines show the dynamics as a function of κ, predicted by Eq (18) (solid line: JE = 2.4; dashed line: JE = 1.5). The intersections with y = 0 correspond to fixed points (filled dots: stable; unfilled dot: unstable). (B) Contribution of mean connectivity to the latent dynamics, Fmean(κ) in Eq (20), for two values of JE. (C) Bifurcation diagram for increasing JE: analytical predictions of Eq (21) compared with simulations of the full network with locally-defined connectivity. Orange line: analytical prediction including only the mean part of the connectivity (setting $\sigma_{m_p} = 0$ in Eq (21)); purple line: analytical prediction including the first-order perturbation term in the rank-one approximation; gray: projection of simulated activity x onto the connectivity vector m computed by perturbation theory for 30 realizations of random connectivity Z, the shaded area reflects finite-size effects (Eq (104)). (D) Comparison between simulations and mappings obtained from the low-rank approximation (Eqs (10) and (16)) for the activity of individual units in a given realization of the random connectivity Z. For each unit i, a dot shows the deviation Δxi of its steady-state activity from the population average, against its value Δmi of the perturbed part of the connectivity vector m (Eq (16)). The low-rank theory predicts Δxi = κΔmi. Orange, cyan and gray scatters show excitatory or inhibitory populations, each for several values of JE. Lines represent y = κx, where κ is obtained from Eq (21). Upper panels show the result in a realization with a high fixed point, bottom panels show the result in a realization with a low fixed point. (E) Comparison between the predictions (solid lines) and simulations (shaded areas) for the population-averaged variances of Δxi. Shaded areas show mean±std and reflect finite-size effects. In C, E, the departure of the centres of the numerical simulation results from the theoretical predictions reflects the systematic errors due to the approximation of the latent dynamical variable κ as well as the low-rank connectivity statistics. Other network parameters: NE = 4NI = 1200, JI = 0.6, gEE: gEI: gIE: gII = 1.0: 0.5: 0.2: 0.8 and gEE = 0.8. The transfer function ϕ has parameter θ = 1.5.

https://doi.org/10.1371/journal.pcbi.1010855.g005

This bistability due to positive feedback is expected on the basis of mean connectivity alone. Indeed, replacing the connectivity matrix by its mean $\bar{J}$ is equivalent to a rank-one approximation with $m = \bar{m}$ and $n = \bar{n}$, which leads to Eq (115) with $\sigma^2_{m_p} = \sigma^{(p)}_{nm} = 0$ for p = E, I; this reduced model corresponds to the classic Amari-Grossberg rate model [49–52]. The first-order perturbation term in the rank-one approximation (Eqs (9) and (10)) additionally takes into account fluctuations in the connectivity, which leads to a non-zero $\sigma^2_{m_p}$ and modifies the fixed points predicted by Eq (21). In consequence, the bifurcation to bistability takes place at higher values of JE than predicted from mean connectivity alone (purple lines compared to orange lines in Fig 5C).

More importantly, we find that the first-order perturbation in the rank-one approximation accurately accounts for the heterogeneous firing rates of individual neurons for specific instances of the random, locally-defined connectivity Z (Eqs (16) and (69), Fig 5D), and therefore predicts well the variance of the steady-state activity across neurons. In particular, cell-type-dependent variances in the synaptic connectivity lead to distinct activity variances $\sigma^2_{x_E}$ and $\sigma^2_{x_I}$ for the excitatory and inhibitory populations (Eqs (90) and (130), Fig 5E).

Note that the independently generated local connectivity can be treated analytically without resorting to a rank-one approximation, by using a different variant of mean-field theory originally developed for randomly connected networks [30, 53]. That theory is not perturbative, and takes into account an additional term in the variance (see Methods Sec. 2.6.3 and S3 Text for more details). However, in contrast to the rank-one approximation, it does not predict the activity of individual neurons, and is challenging to extend beyond independent random connectivity.

1.5.2 Reciprocal motifs.

We next turn to the predictions of the rank-one approximation for dynamics resulting from locally-defined connectivity with reciprocal motifs. In this case, the additional reciprocal correlations in the random part of the connectivity lead to a non-zero covariance σnm between the connectivity vectors n and m in the rank-one approximation (Eqs (92) and (97)). This covariance in turn generates an additional feedback component in the dynamics of the latent variable, the second term in the r. h. s. of Eq (19).

Specifically, the excitation-dominated dynamical regime is largely determined by the noiseless mean connectivity $\bar{J}$, but beyond this mean approximation, positive reciprocal correlations combined with excitation-dominated connectivity enhance the positive feedback with respect to mean connectivity alone (Methods Sec. 2.5.2, Eq (97)). As a result, progressively increasing the reciprocal correlations can induce a bifurcation to bistability, even if the mean excitation provided by the mean approximation is not by itself sufficient to support two stable states (Fig 6A–6C). This is a major novel effect of reciprocal motifs on collective dynamics. As in the case of independent connectivity, we moreover find that the perturbative term in the rank-one approximation accounts for the heterogeneous activity of individual neurons in specific realizations of the connectivity Z (Fig 6D, Eqs (16) and (69)), and therefore also predicts the cell-type-dependent variances of activity (Fig 6E).

Fig 6. Predicting low-dimensional dynamics using a rank-one approximation of networks with homogeneous reciprocal motifs.

(A) Influence of reciprocal correlations on fixed points of the latent variable κ in the rank-one approximation. The lines show the dynamics as function of κ, predicted by Eq (18) (solid line: η = 1; dashed line: η = 0). The intersections with y = 0 correspond to fixed points (filled dots: stable; unfilled dot: unstable). (B) Comparison of the contributions of mean connectivity Fmean(κ) and covariance Fcov(κ) to the latent dynamics of κ (Eq (20)) for η = 1. (C) Bifurcation diagram for increasing η at fixed JE. Solid purple lines: analytical predictions of Eqs (19) and (20); gray areas: projection of simulated activity x onto the connectivity vector m computed by perturbation theory (Eq (104)) for 30 realizations of random connectivity Z, the shaded area reflects finite-size effects. (D) Comparison between simulations and mappings obtained from the low-rank approximation for the activity of individual units in a given realization of the random connectivity Z. For each unit i, a dot shows the deviation Δxi of its steady-state activity from the population average, against its value Δmi of the perturbed part of the connectivity vector m (Eq (16)). The low-rank theory maps Δxi = κΔmi. Orange, cyan and gray scatters show excitatory or inhibitory populations, each for two values of η. Lines represent y = κx, where κ is obtained from Eqs (19) and (20). Upper panels show the result in a realization with a high fixed point, bottom panels show the result in a realization with a low fixed point. (E) Comparison between the predictions (solid lines, Eq. (134)) and simulations (shaded areas) for the population-averaged variances of Δxi. Shaded areas show mean±std and reflect finite-size effects. In C, E, the departure of the centres of the numerical simulation results from the theoretical predictions reflect the systematic errors due to the approximation of the latent dynamical variable κ as well as the low-rank connectivity statistics. Other network parameters: NE = 4NI = 1200, JE = 1.9, JI = 0.6, gEE: gEI: gIE: gII = 1.0: 0.5: 0.2: 0.8 and gEE = 0.8. The transfer function ϕ has parameter θ = 1.5.

https://doi.org/10.1371/journal.pcbi.1010855.g006

More generally, our rank-one approximation allows us to describe the latent dynamics when the degree of reciprocal correlation depends on the pre- and post-synaptic populations (Results Sec. 1.4). Such heterogeneity in reciprocal correlations can enhance different types of feedback. For example, antisymmetric connectivity within the inhibitory population (ηII < 0) disinhibits the excitatory population and thus facilitates bistable transitions (Fig 7A–7C) compared to networks with homogeneous reciprocal correlations. In contrast, excitation-dominated connectivity with homogeneous negative reciprocity (η < 0) generates negative feedback and can therefore suppress the global dynamics from a bistable to a quiescent state (Fig 7D–7F).

Fig 7. Predictions for low-dimensional dynamics using a rank-one approximation of networks with non-homogeneous and anti-symmetric reciprocal motifs.

(A-C) Fixed points of latent dynamics in networks with heterogeneous, cell-type-dependent reciprocal correlations: ηEE = ηEI = −ηII. Network parameters: NE = 4NI = 1200, gEE: gEI: gIE: gII = 1.0: 0.5: 0.2: 0.8, gEE = 0.8, JE = 1.9 and JI = 0.6. (D-F) Fixed points of latent dynamics in networks with homogeneously anti-symmetric motifs, ηpq = η ∈ [−1, 0]. Network parameters: NE = 4NI = 1200, gEE: gEI: gIE: gII = 1.0: 0.5: 0.2: 0.8, gEE = 0.8, JE = 2.2 and JI = 0.6. In C and F, gray areas show projections of simulated activity x onto the connectivity vector m computed by perturbation theory (Eq (104)); shaded areas show mean±std and reflect finite-size effects. In C, F the departure of the centres of the numerical simulation results from the theoretical predictions reflects the systematic errors due to the approximation of the latent dynamical variable κ as well as the low-rank connectivity statistics.

https://doi.org/10.1371/journal.pcbi.1010855.g007

Importantly, describing the role of reciprocal correlations on latent dynamics relies on our global low-rank approximation of locally-defined connectivity. In particular, the effects of such correlations cannot be captured by considering only mean connectivity and population-averaged activity (first term in the r. h. s. of Eq (19)). Moreover, including reciprocal correlations in classical mean-field approaches to randomly connected networks is technically challenging [18].

1.6 Extension: E-I networks with sparse connectivity

In previous sections, we examined locally-defined connectivity generated using Gaussian distributions of individual synaptic weights (function fpq in Eq (1)). Our results for the low-rank approximation of locally-defined connectivity are however independent of the precise form of the distribution fpq. In particular, our finding that the resulting low-rank structure obeys Gaussian-mixture statistics is universal, in the sense that it is valid for any distribution fpq for which the central limit theorem holds (see Discussion). To illustrate this universality, here we turn to networks with sparse connectivity, generated from Bernoulli distributions fpq taking values 0 and Aq (where Aq = AE, AI refer to strengths of excitatory and inhibitory connections), with a uniform fraction c of non-zero connections (see Eq (28)). In this case, the variance of the synaptic strengths Jij scales as 1/N, as we assumed for Gaussian connectivity. The statistics of the corresponding rank-one approximation are fully determined by the mean, variance and covariance of synaptic weights in the excitatory and inhibitory populations. Here we compare the predictions of our perturbative approximation with direct simulations of full-rank networks with locally-defined connectivity. We first consider independently generated connectivity, and then turn to reciprocal motifs.

1.6.1 Independently generated sparse connectivity.

For networks with independently generated sparse connectivity, the mean and variance of individual synaptic weights are given by cAq and c(1 − c)A²q, where q = E, I refers to the population of the presynaptic neuron. Previous works have shown that for such sparse networks, where the variances of synaptic weights scale as 1/N, the bulk of the eigenvalues is distributed within a circle in the complex plane with a radius rg determined by Eq. (149) (Fig 8A) [54, 55], as expected from the universality theorem for random matrices [56]. The overall mean of the synaptic weights instead determines the mean outlying eigenvalue (Eq (78), Fig 8A). In sparse networks, the main novelty with respect to the Gaussian case is that the mean and variance of synaptic weights are not independent parameters, but are instead both set by the synaptic strengths AE and AI as well as the network's sparsity c (Eqs (38) and (39)). In consequence, varying these couplings changes both the radius of the bulk and the outlying eigenvalue, and can lead to intersections where the outliers dip into the bulk (see S4 Text for details).
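A minimal sketch of this sparse model, using parameter values from the Fig 8 caption; the sign convention AI < 0 for inhibitory weights is an assumption, chosen so that the quoted outlier formula λ0 = c(NEAE + NIAI) gives an excitation-inhibition difference:

```python
import numpy as np

rng = np.random.default_rng(2)
N_E, N_I, c = 800, 200, 0.3
N = N_E + N_I
A_E = 0.025
A_I = -A_E / 0.33                  # weight ratio A_E/|A_I| = 0.33, as in Fig 8A

# Bernoulli(c) connections; column j carries weight A_E or A_I.
weights = np.concatenate([np.full(N_E, A_E), np.full(N_I, A_I)])
J = (rng.random((N, N)) < c) * weights

lam0 = c * (N_E * A_E + N_I * A_I)          # predicted mean outlier (Fig 8A)
lam_sim = np.linalg.eigvals(J).real.max()   # dominant eigenvalue of one draw
print(lam0, lam_sim)
```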

Fig 8. Rank-one approximation and predicted low-dimensional dynamics for sparse excitatory-inhibitory networks.

(A) Left: Comparison of the predicted eigenvalue outlier λ0 = c(NEAE + NIAI) (black line) with finite-size simulations (red area shows mean±std for 30 realizations and reflects finite-size effects). The gray area represents the area covered by the eigenvalue bulk. Right: example spectrum of one realization of the connectivity matrix with E/I ratio AE/AI = 0.33, and the radius rg of the eigenvalue bulk computed from the statistically equivalent Gaussian connectivity (see S4 Text). (B) Scatter plot showing for each neuron i its entry ni on the left eigenvector against its entry mi on the right eigenvector. Red and blue colours represent respectively excitatory and inhibitory neurons. The white dots and the dashed lines respectively indicate the means and covariances for each population obtained from simulations. For visualization purposes, the x- and y-axis are scaled unequally. (C) Comparison between eigenvector entries obtained from direct eigen-decomposition of J with projections obtained using perturbation theory (Eqs (9) and (10)) in a given realization of the sparse connectivity J. (D) Bifurcation diagram for increasing the ratio AE/AI: analytical predictions of Eq (117) compared with simulations of the full network with locally generated sparse connectivity. Purple line: analytical prediction including the first-order perturbation term in the rank-one approximation; gray: projection of simulated activity onto the connectivity vector m computed by perturbation theory (Eq (104)) for 30 realizations of the sparse connectivity J, the shaded area reflects finite-size effects. (E) Comparison between the predictions (solid lines) and simulations (shaded areas) for the population-averaged variances of Δxi. Shaded areas show mean±std and reflect finite-size effects. (F) Comparison between simulations and mappings obtained from the low-rank approximation for the activity of individual units in a given realization of the sparse connectivity J. For each unit i, a dot shows the deviation Δxi of the steady-state activity from the population average, against the corresponding value Δmi of the perturbed part of the connectivity vector m (Eq (16)). The low-rank theory maps Δxi = κΔmi. Orange, cyan and gray scatters show excitatory or inhibitory populations, each for two values of the ratio AE/AI. Lines represent y = κx, where κ is obtained from Eqs (19) and (20). Upper panels show the result in a realization with a high fixed point, bottom panels show the result in a realization with a low fixed point. The gray vertical dashed lines in A left, D, E correspond to the critical point at which the absolute value of the outlier is equal to the radius of the eigenvalue bulk. In A, D, E the departure of the centres of the numerical simulation results from the theoretical predictions reflects the systematic errors due to the approximation of the low-rank connectivity statistics as well as the latent dynamical variable κ. Network parameters: NE = 4NI = 800, c = 0.3, AE = 0.025. The transfer function ϕ has parameter θ = 1.5.

https://doi.org/10.1371/journal.pcbi.1010855.g008

As expected, as long as the outlier lies outside of the eigenvalue bulk, the statistics of entries on the resulting low-rank approximating eigenvectors are well described by a Gaussian-mixture distribution with parameters fully determined by the mean and variance of the synaptic weights (Eq (91), Fig 8B). Our perturbative approximation (Eq (69)) moreover accounts for the individual entries of the right- and left-eigenvectors corresponding to the outlier in a specific realization of the fluctuation matrix Z (Fig 8C).

Our predictions for the low-dimensional dynamics based on the rank-one approximation therefore directly extend to sparse networks. Comparing with direct simulations, we found that Eq (117) predicts well the global latent variable κ obtained by projecting the activity x onto the approximated rank-one eigenvector m (Eq (104)). As the E/I ratio is increased, positive feedback increases, and the latent variable κ undergoes a transition from a single fixed point to two bistable states (Fig 8D).

From the statistics of the right eigenvector m (Eq (91)), our analysis predicts the heterogeneity of activity in terms of population-averaged variances (Fig 8E). This heterogeneity is identical in excitatory and inhibitory populations, as their right eigenvectors mE, mI have identical fluctuations (Fig 8B and 8C). For individual realizations of the sparse connectivity, the rank-one approximation x = κm moreover captures the activation xi of individual neurons in the specific simulation instances (Fig 8F).

1.6.2 Sparse EI networks with reciprocal motifs.

For sparse E-I networks, we generate reciprocal motifs by introducing a fraction ρpq of reciprocally connected pairs of neurons. Together with the sparsity c and the synaptic strengths, the parameter ρpq determines the cell-type dependent reciprocal correlation ηpq (Methods Sec. 2.1.2 Eqs (34) and (39), Fig 9A).

Fig 9. Characterizing connectivity statistical properties and low-dimensional dynamics for the sparse network with reciprocal motifs.

(A) Schematic of a sparse EI network with four forms of paired connections. White and black rectangles represent the non-zero excitatory and inhibitory sparse connections. (B) Eigenvalue spectrum of the sparse connectivity (upper panel) and of the equivalent Gaussian connectivity (bottom panel) with reciprocal motifs. Cell-type-dependent reciprocal correlations are ηEE = 0.71, ηEI = −0.71, ηII = −0.43 in both connectivity matrices; continuous eigenvalue bulks show eigenvalues for one realization of the network connectivity. Red arrows point to the unperturbed eigenvalue λ0. Outlying eigenvalues are shown for 30 realizations of the network connectivity. Coloured circles are the eigenvalues predicted using the determinant lemma and a truncated series expansion (Eqs (7) and (8), truncated at k = 2); coloured triangles are the eigenvalues predicted using the determinant lemma without finite truncation (see S6 Text, Eqs (153)–(157)). (C) Comparison of the eigenvalues from the finite-size simulation of the sparse connectivity with the predictions of the determinant lemma as the reciprocal correlation ηEE (−ηEI) is progressively increased. The coloured solid lines show the roots of the third-order polynomial in Eq (8). The purple area indicates the empirical distribution (mean ± std, finite-size effects) of the dominant outlier for 30 realizations of the sparse connectivity J, while the black dashed line is the eigenvalue λ0 of the corresponding independent sparse connectivity matrix (Eq (78)). The gray areas correspond to the areas covered by the eigenvalue bulk. (D) Scatter plot showing for each neuron i its entry ni on the left eigenvector against its entry mi on the right eigenvector. Red and blue colours represent respectively excitatory and inhibitory neurons. The white dots indicate the means for each population obtained from simulations. For visualization purposes, the x- and y-axes are scaled unequally. (E) Comparison between eigenvector entries obtained from direct eigen-decomposition of J and projections obtained using perturbation theory (Eqs (9) and (10)) in a given realization of the sparse connectivity J. (F) Comparison between the population covariances of the entries on the left and right connectivity eigenvectors for the different populations (coloured areas, finite-size effects) and the predictions of perturbation theory (coloured lines, Eq (99)). (G) Bifurcation diagram for increasing reciprocal correlation ηEE (−ηEI): analytical predictions of Eq (119) compared with simulations of the full network with locally generated sparse connectivity and reciprocal motifs. Purple line: analytical prediction including the first-order perturbation term in the rank-one approximation; gray: projection of simulated activity onto the connectivity vector m computed by perturbation theory (Eq (104)) for 30 realizations of the sparse connectivity J; the shaded area reflects finite-size effects. (H) Comparison between simulations and mappings obtained from the low-rank approximation for the activity of individual units in a given realization of the sparse connectivity. For each unit i, a dot shows the deviation Δxi of the steady-state activity from the population average, against the corresponding value Δmi of the perturbed part of the connectivity vector m (Eq (16)). The low-rank theory maps Δxi = κΔmi. Orange, cyan and gray dots show excitatory or inhibitory populations, each for two values of the reciprocal correlation ηEE. Lines represent y = κx, where κ is obtained from Eqs (19) and (20).
Upper panels show the result in a realization with a high fixed point, bottom panels show the result in a realization with a low fixed point. (I) Comparison between the predictions (solid lines) and simulations (shaded areas) for the population-averaged variances of Δxi; shaded areas show mean ± std and reflect finite-size effects. In C, F, G, I, the reciprocal correlations ηEE = −ηEI progressively increase from −0.43 to 1.0 while keeping ηII = −0.43 constant (ρEE = ρEI increase from 0 to 1 and ρII = 0 is fixed). In C, F, G, I, the departure of the centres of the numerical simulation results from the theoretical predictions reflects the systematic errors due to the approximation of the low-rank connectivity statistics as well as of the latent dynamical variable κ. Network parameters: NE = 4NI = 800, c = 0.3, AE = 0.023, AE/AI = 0.3. The transfer function ϕ has parameter θ = 1.5.

https://doi.org/10.1371/journal.pcbi.1010855.g009

We first examine the effect of the reciprocal motifs on the statistical properties of the eigenvalues and eigenvectors. The spectrum still consists of continuous eigenvalues and a discrete outlier. Since in the independent case the outlier depends only on the synaptic strengths AE, AI and the sparsity c, here we fix these two variables but increase ηEE = −ηEI while keeping ηII constant. As in the case of dense, Gaussian networks, the outlier increases with increasing reciprocal correlation and deviates from the outlier of the corresponding independent sparse connectivity matrix (Fig 9B and 9C). Moreover, in the large network limit, we find that if means, variances, and reciprocal correlations are identical, dense Gaussian connectivity leads to the same eigenvalue spectrum as the sparse connectivity (Methods Secs. 2.1.2, 2.1.3, Eqs (38) and (39)). We furthermore analytically predict two additional complex-conjugate eigenvalue outliers generated by the reciprocal connections in the sparse case (Eqs (82), (88) and (89), Fig 9B).

As for uncorrelated connectivity, perturbation theory accounts for the individual left and right eigenvector entries for specific instances of the sparse connectivity, which altogether follow Gaussian statistics as expected (Fig 9D and 9E, Eqs (69) and (70)). Importantly, reciprocal correlations induce a non-zero covariance σnm between the entries mi and ni of the right and left eigenvectors (Eq (99), Fig 9F).

Finally, we examine the population dynamics in the sparse network with reciprocal motifs using the low-rank approximation derived above. The reciprocal motifs in the example network generate an overall positive feedback. Therefore, gradually increasing the reciprocal correlation ηEE (−ηEI) in the example network induces a bifurcation into bistability (Eq (119), Fig 9G). Analogously to the perturbation-theory predictions for the individual eigenvector entries (Methods Sec. 2.3, Eq (69)), the low-rank approximation analytically accounts for the activity of individual neurons in specific connectivity realizations (Eq (16), Fig 9H), and hence for the cell-type dependent variances of neuronal activation (Fig 9I) obtained from finite-size simulations of the original sparse networks.

Discussion

In this work, we unified two different descriptions of connectivity in multi-population networks and thereby connected two broad classes of models. Starting from local statistics of synaptic weights, we approximated the resulting connectivity matrix in terms of a Gaussian-mixture low-rank structure. The obtained, approximate low-rank network model then allowed us to determine the influence of the local connectivity motifs on the global low-dimensional dynamics.

Gaussian-mixture low-rank networks may seem to rely on major simplifying assumptions to make the dynamics more tractable and interpretable, such as Gaussian-distributed entries of the connectivity vectors that are independent across neurons. Our analyses however show that this class of models is less restrictive than it may appear at first. Specifically, using our analytical approach, we show that even if the distribution from which the locally-defined Jij is sampled is not Gaussian (e. g. Bernoulli), the corresponding entries mi, nj of the connectivity vectors nonetheless converge to a multi-variate Gaussian distribution in the large network limit, provided the conditions of the central limit theorem are met. Importantly, our results for network connectivity with reciprocal motifs show that the assumption of independent entries mi (ni) does not rule out reciprocally-correlated synaptic weights (Jij, Jji) in the locally-defined connectivity.

A key ingredient in our approach is a low-rank approximation of the locally-defined connectivity matrix. Approximating an arbitrary full-rank matrix by a rank-R one is a classical problem in numerical analysis, for which a number of different methods are available depending on the objective of the approximation [57]. The most common method is to perform a singular value decomposition (SVD), and keep the top R terms [58]. This method minimizes the Frobenius norm of the difference between the original matrix and its low-rank approximation. Our goal in this study was however to obtain a low-rank approximation that preserves the dominant eigenvalues of the original matrix, as these eigenvalues determine the autonomous dynamics in the network. An SVD-based approximation preserves the top singular values, but in general not the top eigenvalues (S1 Fig), and this can lead to an inaccurate approximation of autonomous dynamics [59]. We therefore opted for an approximation based on truncated eigen-decomposition. When studying input-driven and transient dynamics, different methods for low-rank approximation may be more appropriate, and are a topic of active research [60–65].
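The following sketch illustrates this distinction on a toy matrix (an example of our own construction, not taken from the original analysis): a non-normal rank-one structure with a known outlier plus an i.i.d. bulk, truncated once by eigendecomposition and once by SVD.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g = 500, 0.5
# Toy full-rank matrix: non-normal rank-one structure with outlier 1.6, plus a bulk.
m_v = np.ones(N) / np.sqrt(N)
n_v = m_v + 3.0 * rng.standard_normal(N) / np.sqrt(N)   # partially aligned with m_v
J = 1.6 * np.outer(m_v, n_v) / (n_v @ m_v) \
    + g * rng.standard_normal((N, N)) / np.sqrt(N)

# Rank-one truncation by eigendecomposition: keeps the dominant eigenvalue exactly.
w, V = np.linalg.eig(J)
k = np.argmax(np.abs(w))
Vinv = np.linalg.inv(V)                    # rows of V^{-1} are the left eigenvectors
J_eig = (w[k] * np.outer(V[:, k], Vinv[k])).real

# Rank-one truncation by SVD: keeps the top singular value instead.
U, s, Vt = np.linalg.svd(J)
J_svd = s[0] * np.outer(U[:, 0], Vt[0])

print("dominant eigenvalue of J     :", w[k].real.round(3))
print("outlier of eig-based rank-one:", np.linalg.eigvals(J_eig).real.max().round(3))
print("outlier of SVD-based rank-one:", np.linalg.eigvals(J_svd).real.max().round(3))
```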

To perform the eigen-decomposition of excitatory-inhibitory connectivity matrices, we leveraged the fact that they can be expressed as a sum of a block-like deterministic low-rank matrix and a full-rank random matrix with zero mean [40]. The eigenspectrum of such matrices in general consists of a continuously-distributed bulk that is attributed to the random component, and discrete eigenvalues that are generated by the low-rank excitatory-and-inhibitory mean connectivity [44, 45]. When the means of excitatory and inhibitory weights approximately cancel each other, the corresponding eigenvalue is small and resides within the bulk. A number of works have examined the bulk of the eigenvalue spectrum for random matrices [11, 18, 44, 53, 54, 56, 66, 67], and showed that the obtained eigenvalue statistics have important implications for network dynamics such as spontaneous fluctuations [68], oscillations [69, 70] and correlations in asynchronous irregular activity [71]. In contrast, in this work we focus on the parameter regime where the discrete eigenvalues are outliers, well separated from the eigenvalue bulk. The outlying eigenvalues, and in particular the corresponding eigenvectors, have to our knowledge received less attention.

The techniques used in this work to compute perturbed eigenvalues can be traced back to the classic work of Tao [45]. The intriguing finding that low-rank perturbations of i. i. d. matrices have a range of nonlinear effects on the outliers inspired our investigation: if we instead think of the correlated random connectivity as a perturbation of the dominant low-rank structure, what effects does it have on the outliers? Using linear response theory and mean-field techniques, recent research [66] extended Tao’s findings to network dynamics by investigating the outliers of the covariance matrix of the dynamic fluctuations in i. i. d. networks. In particular, that work showed the effects of second-order motifs on the bulk eigenvalue distribution of the covariance matrix. Our analyses use the low-rank excitatory-and-inhibitory mean structure to go one step further and analytically demonstrate the impact of cell-type-dependent reciprocal motifs on the outlier λ (Methods Sec. 2.4.2).

The main technical novelty in this work is the use of matrix perturbation theory [47, 72] to approximate the eigenvectors corresponding to the outliers in the eigenspectrum of the locally-defined connectivity matrices. A key output of this approach is the finding that entries of the left- and right-eigenvectors follow multivariate Gaussian distributions, the statistics of which depend on the population the neurons belong to. In particular, these entries on the left and right vectors are uncorrelated in networks with i. i. d. local connectivity (Sec. 1.3), while reciprocal motifs induce correlations between them and thereby result in zero-mean overlaps between the vectors for each population (Sec. 1.4). This result provides a general theoretical mapping from locally-defined multi-population models to Gaussian-mixture low-rank networks [31, 33]. It however holds only as long as the entries of the resulting approximate low-rank vectors satisfy the assumption of independence across neurons, and the distribution of synaptic weights satisfies the assumptions of the central limit theorem. This specifically rules out, for instance, strong synaptic weights that may be analogous to connectivity hubs, as well as the heavy-tailed distributions often found in experimental studies [1, 73]. Other techniques such as Feynman diagrams [74] may provide a way to further study the effects of structure on dynamics and in particular correlations.

In the networks we considered, the non-random structure in connectivity comes only from the multi-population organization. More specifically, the low-rank skeleton of the locally-defined connectivity matrix is fully specified by the mean synaptic weights between different populations (Eq (3)). This mean connectivity structure largely controls the outlying eigenvalue, and the average values of the corresponding eigenvector entries. The random part of the connectivity and reciprocal motifs can modify the outlying eigenvalue, and add heterogeneity as well as correlations to this underlying structure. Changes to the underlying connectivity thereby further regulate the internal network dynamics, for instance the transition to bistability [75]. In particular, in networks with i. i. d. random connectivity, even though the latent dynamics is primarily determined by the mean excitatory-and-inhibitory connectivity, the independent random part further determines the heterogeneity of neural activity and indirectly affects the bifurcation point (Sec. 1.5.1, Fig 5C); reciprocal motifs in the random component, in contrast, directly control the latent dynamics by adding recurrent feedback beyond that generated by the mean connectivity alone (Sec. 1.5.2). Our perturbative theory moreover allows us to quantify these effects and accurately account for the heterogeneity in dynamics on a single-neuron basis. To further incorporate experimental data on synaptic connectivity into recurrent network models, an important question is how the networks of rate units used here relate to more biologically realistic spiking networks where neurons interact through discrete action potentials [76, 77]. One approach for investigating this relationship is to map each spiking neuron onto a rate unit, and therefore compare rate and spiking networks with an identical connectivity matrix. Using this approach, recent work has shown that theoretical results in rate networks directly predict low-dimensional dynamics in spiking networks with identical low-rank connectivity [78]. This provides a possible justification for interpreting the connectivity in rate networks directly in terms of experimentally measured local connectivity statistics.

A key insight from our study is a general relationship between reciprocal motifs in locally-defined connectivity and overlaps among connectivity vectors in low-rank networks, which had not been investigated in previous works on low-rank models [30, 31, 33, 79]. Indeed, we have shown that correlations between reciprocal synaptic weights generate overlaps beyond the mean in the corresponding low-rank approximation (Eq (95)). Conversely, zero-mean overlaps between connectivity vectors in a low-rank model necessarily imply non-vanishing reciprocal correlations (Eq (43) and S5 Text). Since overlaps between connectivity vectors determine the autonomous recurrent dynamics in low-rank networks, this relationship allowed us to quantify how reciprocal connectivity motifs contribute to network dynamics.

Local statistics of synaptic connectivity are believed to play an important role in global network dynamics [1, 17, 32, 66]. Our study provides a mathematical theory that relates local connectivity statistics to global recurrent dynamics through a low-rank approximation. In addition to the reciprocal motifs that we have focused on in this work, it has been demonstrated that other second-order motifs, including convergent, divergent and chain motifs, have important effects on the statistics of fluctuations [14, 22, 23, 80]. Specifically, recent studies have revealed that the dimensionality of balanced networks and the norm of the continuous eigenvalue bulk are tuned differently by the statistics of various types of motifs [15, 66, 75]. Examining the effects of these additional types of motifs on eigenvalue outliers and the corresponding low-dimensional dynamics would be an important future step to connect models to large-scale electrophysiological recordings of cortical microcircuits [6, 7, 15, 81].

2 Materials and methods

Throughout this study, we consider recurrent networks of N neurons and denote by J the recurrent connectivity matrix, where Jij is the synaptic strength of the connection from neuron j to neuron i.

2.1 Locally-defined multi-population connectivity

In this section, we introduce a first class of connectivity models, in which the synaptic couplings are generated based on local statistics determined by the identity of the pre- and post-synaptic neurons. The N neurons in the network are organized in P populations, where population p has Np neurons. Denoting by p and q the populations neurons i and j belong to, the value of the synaptic coupling Jij is drawn randomly from a distribution whose statistics depend on the pre- and postsynaptic populations q and p. The full connectivity matrix J therefore has a block structure, in the sense that all connections within the same block share identical statistics.

We examine two variants of this model class: (1) independent random connectivity [53]; (2) connectivity with reciprocal motifs [10, 82]. In each case, we examine two specific examples of distributions of synaptic strengths: Gaussian and sparse.

2.1.1 Independent random connectivity.

For networks with independent random connectivity, the recurrent connections Jij are sampled independently for each (i, j) pair from Jij ∼ fpq (22), where fpq denotes a probability density function, and q, p are the pre- and post-synaptic populations. Separating the mean and random components, for an arbitrary distribution Eq (22) can be re-expressed as Jij = J̄pq + zij. (23)

Here J̄pq is the mean value of the connections from population q to population p, and zij is the remaining zero-mean random part of each connection. Defining J̄ as the N × N deterministic matrix consisting of the mean values, and Z as the noise matrix consisting of the random parts zij, the connectivity matrix J can be written as J = J̄ + Z. (24)

The matrix J̄ is of size N × N and consists of P² blocks with identical values within each block. The rank of J̄ is therefore at most P [40]. In contrast, the noise matrix Z is in general of rank N. The full connectivity matrix J can then be interpreted as a rank-P deterministic matrix perturbed by the random matrix Z with block-dependent statistics.

In the case of Gaussian connectivity, connections from population q to population p are sampled independently from a Gaussian distribution, Jij ∼ N(J̄pq, g²pq/N), (25) so that the variance of the random part zij is [z²ij] = g²pq/N. (26)

The noise matrix Z therefore has block-structured variances, which we specify by a P × P matrix Gm with entries gpq. (27)
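As an illustration, a minimal sketch for sampling such a block-structured Gaussian connectivity might look as follows (the function name and parameter values are ours; the rank-one mean matrix follows Eqs (35)–(36) below):

```python
import numpy as np

def gaussian_block_connectivity(Ns, Jbar, G, rng):
    """J = mean + noise with block-wise statistics (illustrative helper).

    Ns   : population sizes, e.g. [NE, NI]
    Jbar : P x P matrix of mean weights (including any 1/N_q scaling)
    G    : P x P matrix of noise scales g_pq; entry std devs are g_pq/sqrt(N)
    """
    N = sum(Ns)
    idx = np.repeat(np.arange(len(Ns)), Ns)      # population label of each neuron
    mean = Jbar[np.ix_(idx, idx)]                # block-constant mean matrix
    std = G[np.ix_(idx, idx)] / np.sqrt(N)       # block-constant standard deviations
    return mean + std * rng.standard_normal((N, N))

rng = np.random.default_rng(2)
NE, NI = 800, 200
JE, JI = 2.0, 1.0
Jbar = np.array([[JE / NE, -JI / NI],            # rank-one mean, as in Eqs (35)-(36)
                 [JE / NE, -JI / NI]])
G = 0.5 * np.ones((2, 2))                        # homogeneous g_pq for simplicity
J = gaussian_block_connectivity([NE, NI], Jbar, G, rng)
print("predicted outlier JE - JI =", JE - JI,
      "| max real eigenvalue =", np.linalg.eigvals(J).real.max().round(3))
```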

In the case of sparse connectivity, each connection Jij is a scaled Bernoulli random variable: the connectivity weights from population q to population p are non-zero with probability cpq and zero otherwise. All non-zero connection weights within a block take the same value Apq, so that analogously to Eq (22), the sparse connectivity is defined as Jij = Apq with probability cpq, and Jij = 0 with probability 1 − cpq. (28)

The mean connectivity weight between populations p, q is then J̄pq = cpq Apq, (29) and the variance of the remaining random part zij is [z²ij] = cpq(1 − cpq)A²pq. (30) To simplify the parameters in sparse networks, we assume that Apq depends only on the presynaptic population q (we write Aq), and that the connection probability cpq is a homogeneous network parameter independent of p, q, which we denote by c.

2.1.2 Reciprocal connectivity motifs.

To go beyond independent connectivity, we consider pairwise motifs, i. e. correlations between reciprocal pairs of weights Jij and Jji. We quantify this correlation using the normalized covariance ηij = ([Jij Jji] − [Jij][Jji]) / √(Var(Jij) Var(Jji)), (31) where [⋅] denotes the average over the full connectivity distribution. Reciprocal connections are fully independent when ηij = 0 for all i, j, fully symmetric when ηij = 1 and fully anti-symmetric when ηij = −1.

Our key assumption is that the statistics of connectivity are block-like, implying that all pairs of connections between populations p, q share the same correlation coefficient ηpq, so that the statistics are defined by a P × P reciprocal correlation matrix ηm with entries ηpq (32), where, by definition, ηpq = ηqp.

For Gaussian statistics, we generate connectivity matrices with a specified set of ηpq in the following manner. We first generate an N × N matrix Y′ with entries independently sampled from the standard normal distribution N(0, 1). Then, in order to generate a matrix Y with reciprocal correlations ηpq, we form a linear combination of Y′ and its transpose Y′ᵀ. Specifically, we set Y to a weighted sum of Y′ and Y′ᵀ (33), with block-wise coefficients chosen so that the correlation between Yij and Yji equals ηpq (the weight of Y′ᵀ is positive for ηpq > 0, and negative for ηpq < 0). Finally, we scale each block by gpq/√N to obtain the random connectivity component Z, which is added to the mean connectivity component J̄ to finally obtain the full connectivity matrix J.
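Since the explicit coefficients of Eq (33) are not reproduced here, the sketch below uses an equivalent construction that we find convenient: combining the symmetric and antisymmetric parts of Y′ with weights chosen so that the reciprocal correlation equals η (a single η for the whole matrix; the block-dependent case of Eq (33) applies different weights within each block).

```python
import numpy as np

def reciprocally_correlated_gaussian(N, eta, rng):
    """Zero-mean Gaussian matrix with corr(Y_ij, Y_ji) = eta (single-eta sketch)."""
    Yp = rng.standard_normal((N, N))
    S = (Yp + Yp.T) / np.sqrt(2)       # symmetric part, unit variance off-diagonal
    A = (Yp - Yp.T) / np.sqrt(2)       # antisymmetric part, unit variance
    a, b = np.sqrt((1 + eta) / 2), np.sqrt((1 - eta) / 2)
    return a * S + b * A               # Cov(Y_ij, Y_ji) = a^2 - b^2 = eta

rng = np.random.default_rng(3)
Y = reciprocally_correlated_gaussian(2000, 0.7, rng)
iu = np.triu_indices(2000, k=1)
print("empirical reciprocal correlation:",
      np.corrcoef(Y[iu], Y.T[iu])[0, 1].round(3))
```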

For sparse networks, we first generate a connectivity matrix without reciprocal correlations. We then consider the upper triangle of this matrix, randomly select a fraction ρpq of the non-zero connections Jij with value Aq and set their reciprocal connectivity weights Jji to have a non-zero weight Ap. For the remaining 1 − ρpq fraction of non-zero connections in the upper triangle, we set the reciprocal connectivity weights to zero. The corresponding cell-type dependent reciprocal correlations for the multi-population sparse connectivity are then ηpq = sign(ApAq)(ρpq − c)/(1 − c), (34) where c is the homogeneous connection probability (Table 1).
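A minimal single-ρ sketch of this procedure is given below (our own implementation; for pairs whose upper-triangle entry is zero, we assume the lower entry is drawn with probability c(1 − ρ)/(1 − c), which keeps the marginal connection probability equal to c and reproduces η = (ρ − c)/(1 − c) for same-sign weights):

```python
import numpy as np

def sparse_reciprocal(N, c, A_pre, rho, rng):
    """Sparse connectivity with connection probability c, reciprocal fraction rho.

    Single-rho sketch; A_pre[j] is the weight of connections from presynaptic
    neuron j.  Lower-triangle entries whose upper partner is zero are drawn
    with probability c*(1-rho)/(1-c) (assumption: preserves sparsity c).
    """
    iu = np.triu_indices(N, k=1)
    upper = rng.random(len(iu[0])) < c
    p_low = np.where(upper, rho, c * (1 - rho) / (1 - c))
    lower = rng.random(len(iu[0])) < p_low
    J = np.zeros((N, N))
    J[iu] = upper * A_pre[iu[1]]                 # J_ij, presynaptic neuron j
    J[(iu[1], iu[0])] = lower * A_pre[iu[0]]     # reciprocal J_ji, presynaptic i
    return J

# Empirical check of eta = (rho - c)/(1 - c) for a one-population network.
rng = np.random.default_rng(4)
N, c, rho = 2000, 0.3, 0.8
J = sparse_reciprocal(N, c, np.ones(N), rho, rng)
iu = np.triu_indices(N, k=1)
print("eta empirical:", np.corrcoef(J[iu], J.T[iu])[0, 1].round(3),
      "| predicted:", round((rho - c) / (1 - c), 3))
```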

2.1.3 Excitatory-inhibitory networks.

In this work, we specifically focus on excitatory-inhibitory networks composed of P = 2 populations, one excitatory and one inhibitory, with respectively NE and NI neurons. We denote the two populations by indices E and I, so that there are four types of connections: EE, EI, IE and II. Based on the usual anatomical estimates for neocortex, we choose NE = 0.8N, NI = 0.2N, and further define αE = NE/N, αI = NI/N, as the fractions of excitatory and inhibitory neurons.

For Gaussian networks, we enforce Dale’s law only on the mean, i. e. we set J̄EE and J̄IE to be positive, while J̄EI and J̄II are negative. The N × N mean connectivity matrix J̄ is therefore in general of rank two. To further simplify the setting, we follow [43], and consider networks where the mean weights of all excitatory connections, and respectively all inhibitory connections, are equal and set by parameters JE and JI: J̄pE = JE/NE (35) and J̄pI = −JI/NI (36) for p = E, I.

Under these additional assumptions, the entries in the first NE columns of the mean connectivity matrix J̄ have the same positive weight JE/NE, and the entries in the following NI columns have the same negative weight −JI/NI, so that J̄ becomes rank one.

We however allow the variances g²pq and reciprocal correlations ηpq to depend on both the pre- and post-synaptic population, so that the corresponding parameters gpq and ηpq form 2 × 2 matrices (37), where ηEI = ηIE.

For sparse excitatory-inhibitory networks, all non-zero excitatory (resp. inhibitory) synaptic weights are equal, with AE > 0 (resp. AI < 0). From Eqs (29) and (30), the mean and the variance of the synaptic weights in the sparse network can be matched to the parameters of the Gaussian model [83]: JE = cNEAE, JI = −cNIAI and g²pq = Nc(1 − c)A²q. (38)

In particular, for the sparse networks with pairwise reciprocal motifs, in addition to the matched means and variances, the cell-type dependent reciprocal correlations satisfy (Eq (34)) ηpq = sign(ApAq)(ρpq − c)/(1 − c). (39)

2.2 Globally-defined connectivity: Gaussian-mixture low-rank networks

In this section, we introduce a second broad class of connectivity models, Gaussian-mixture low-rank networks [31, 33], in which the connectivity matrix is generated from global statistics of connectivity vectors defined over neurons.

Low-rank networks are a class of recurrent neural networks in which the connectivity matrix J is restricted to be of rank R, assumed to be much smaller than the number of neurons N. Such a connectivity matrix can be expressed as a sum of R unit rank terms (40)

We refer to and as the r-th left and right connectivity vectors. The 2R connectivity vectors together fully specify the connectivity matrix. Each neuron i is then characterized by its set of 2R entries on these vectors. For unit-rank networks, the main focus of this study, the connectivity matrix is simply given by the outer product of a pair of connectivity vectors m and n: (41)

Gaussian-mixture low-rank networks are a subset of the class of low-rank networks, for which the entries of the connectivity vectors are drawn independently for each neuron from a mixture-of-Gaussians distribution [31]. Specifically, a fraction αp of neurons is assigned to each population p, and within each population, the entries on the connectivity vectors are generated from a given 2R-dimensional Gaussian distribution. For a unit-rank network, for a neuron i in population p, the connectivity parameters (mi, ni) are generated from a bi-variate Gaussian distribution with means m̄p and n̄p, variances σ²m,p and σ²n,p, and covariance σnm,p.

For any unit-rank matrix of the form in Eq (41), the only potentially non-zero eigenvalue is given by the overlap between the connectivity vectors n and m, and the corresponding right (resp. left) eigenvector is m (resp. n). For a Gaussian-mixture model, in the large N limit this eigenvalue becomes (42)

Starting from a Gaussian-mixture low-rank model in which the connectivity is globally defined, and in which the assumption that mi, ni are drawn independently across neurons from a multi-variate Gaussian distribution is satisfied, it is straightforward to compute the resulting local statistics of the connectivity, i. e. the mean J̄pq and variance g²pq (Methods Sec. 2.1.1) and the reciprocal correlation ηpq (Methods Sec. 2.1.2), as: (43) in the large network limit. The corresponding expressions for the local statistics generated by rank-R connectivity are given in S5 Text.
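The following one-population sketch (our own illustration, using J = m nᵀ without additional normalization) demonstrates this direction of the mapping by measuring the local statistics of a unit-rank matrix generated from jointly Gaussian (mi, ni); the predicted reciprocal correlation follows from a short calculation on the joint moments, in the spirit of Eq (43):

```python
import numpy as np

rng = np.random.default_rng(5)
N = 2000
# One population: entries (m_i, n_i) jointly Gaussian with covariance s_mn.
mbar, nbar, sm, sn, smn = 0.5, 0.3, 1.0, 1.0, 0.6
mn = rng.multivariate_normal([mbar, nbar], [[sm**2, smn], [smn, sn**2]], size=N)
m, n = mn[:, 0], mn[:, 1]
J = np.outer(m, n)                       # unit-rank connectivity, J_ij = m_i * n_j

iu = np.triu_indices(N, k=1)
# Local statistics implied by the global model (our own moment calculation).
mean_pred = mbar * nbar
eta_pred = ((mbar * nbar + smn)**2 - mean_pred**2) / (
           (mbar**2 + sm**2) * (nbar**2 + sn**2) - mean_pred**2)
print("mean J_ij:", J[iu].mean().round(3), "| predicted:", round(mean_pred, 3))
print("reciprocal corr:", np.corrcoef(J[iu], J.T[iu])[0, 1].round(3),
      "| predicted:", round(eta_pred, 3))
```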

2.3 Approximating locally-defined connectivity with Gaussian-mixture low-rank models

In this section, we describe our general approach for approximating an arbitrary connectivity matrix J with a rank-R matrix JR. We then show that for J corresponding to locally-defined multi-population connectivity (Methods Sec. 2.1), the resulting approximation JR in general obeys Gaussian-mixture low-rank statistics as defined in Methods Sec. 2.2.

The connectivity matrices J that we studied have randomly generated entries. Such matrices can almost surely (with probability 1) be diagonalized over the complex numbers, as the set of non-diagonalizable matrices has measure zero. The matrices we consider are however not symmetric and in general non-normal [41], so that the left and right eigenvectors are in general not identical. Instead, they form a bi-orthogonal set [84]. Specifically, to approximate a full-rank matrix J with a rank-R matrix JR, we use truncated eigen-decomposition, which preserves the dominant eigenvalues. We start from the full eigen-decomposition of J: J = Σr=1…N λr Rr Lrᵀ, (44) where λr is the r-th eigenvalue of J (ordered by decreasing absolute value), while Rr and Lr are the corresponding right- and left-eigenvectors, which obey J Rr = λr Rr (45), Lrᵀ J = λr Lrᵀ (46) and Lrᵀ Rs = δrs. (47)

In the following, we constrain the right eigenvectors Rr to be of unit norm, while the normalization of the left eigenvector is determined by Eq (47).

We obtain a rank-R approximation JR of J by keeping the first R terms in Eq (44): JR = Σr=1…R λr Rr Lrᵀ. (48)

The R non-trivial eigenvalues and eigenvectors of JR therefore correspond to the first R eigenvalues and eigenvectors of J. We then set (49) (50) to have the same normalization for JR as in Eq (40).
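A minimal implementation of this truncation might look as follows (our own sketch; it assumes J is diagonalizable, and returns a complex matrix unless the retained eigenvalues are real or come in conjugate pairs):

```python
import numpy as np

def rank_R_approx(J, R):
    """Truncated eigendecomposition J_R = sum_{r<=R} lambda_r R_r L_r^T (Eq (48)).

    Right eigenvectors kept at unit norm, left eigenvectors scaled so that
    L_r^T R_r = 1 (Eq (47)).
    """
    w, V = np.linalg.eig(J)
    order = np.argsort(-np.abs(w))           # sort by decreasing |lambda|
    w, V = w[order], V[:, order]
    Vinv = np.linalg.inv(V)                  # rows are the matching left eigenvectors
    JR = np.zeros(J.shape, dtype=complex)
    for r in range(R):
        JR += w[r] * np.outer(V[:, r], Vinv[r])   # eig returns unit-norm columns
    return JR.real if np.allclose(JR.imag, 0, atol=1e-8) else JR

rng = np.random.default_rng(6)
N = 300
J = 1.5 * np.outer(np.ones(N), np.ones(N)) / N \
    + 0.4 * rng.standard_normal((N, N)) / np.sqrt(N)
J1 = rank_R_approx(J, 1)
print("outlier of J :", np.linalg.eigvals(J).real.max().round(3))
print("outlier of J1:", np.linalg.eigvals(J1).real.max().round(3))
```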

To obtain a low-rank approximation for a connectivity matrix J generated from the locally-defined statistics of Methods Sec. 2.1, we first determine its dominant eigenvalues and eigenvectors. Starting from Eq (24), this problem becomes equivalent to finding the dominant eigenvalues and eigenvectors of a low-rank matrix perturbed by a random matrix Z with block-like statistics. We compute the statistics of these eigenvalues and eigenvectors by combining the matrix determinant lemma, matrix perturbation theory and the central limit theorem. Below we summarize this general approach before applying it to different specific cases in Methods Secs. 2.4, 2.5.

We focus on the case where J̄ is unit rank, as in the simplified excitatory-inhibitory network introduced in Methods Sec. 2.1.3. In that case, the unique non-zero eigenvalue of J̄ is λ0 = JE − JI, (51) and the corresponding right eigenvector is uniform across neurons, while the left eigenvector takes values proportional to JE/NE on excitatory and −JI/NI on inhibitory neurons (Eq (52)).

J̄ can then be rewritten as J̄ = m̃ ñᵀ, (53) where the structure vectors m̃ and ñ are uniquely defined by rescaling the left and right eigenvectors of J̄ (Eq (52)) as in Eq (49), so that (54) (55) (56)

The full connectivity matrix J can then be expressed as J = m̃ ñᵀ + Z. (57)

Eigenvalues. For a random matrix Z with independently distributed elements, the eigenvalues are distributed on a disk of radius rg centred at the origin in the complex plane [11, 44, 85]. Correlations between elements in general modify the shape of this continuous spectrum [18, 86]. In contrast, adding a low-rank component typically induces isolated eigenvalues outside the continuous part of the spectrum [44, 45]. To obtain a low-rank approximation of the full matrix, we focus on determining these outliers when they exist.

All eigenvalues λ of J satisfy the characteristic equation det(J − λI) = 0. (58)

To determine the outlying eigenvalues of a random connectivity with low-rank structure, we apply the matrix determinant lemma to the l. h. s. of the characteristic equation [32]: det(J − λI) = (1 + ñᵀ(Z − λI)⁻¹m̃) det(Z − λI). (59)

As outliers by definition cannot be eigenvalues of Z, they correspond to zeros of the first factor on the r. h. s., and therefore satisfy: ñᵀ(λI − Z)⁻¹m̃ = 1, (60) where I is the identity matrix. We assume that the maximal eigenvalue of Z is smaller in absolute value than the outlying eigenvalues λ we aim to determine; an upper bound on this maximal eigenvalue is provided by the spectral norm of the matrix Z [87]. As a result, we can expand (I − Z/λ)⁻¹ in a series, and further get [32] λ = Σk≥0 θk, (61) with θk = ñᵀ Zᵏ m̃ / λᵏ. (62)

Here θ0 corresponds to the eigenvalue λ0 of (Eq (51)), and the higher order terms specify how this eigenvalue is modified by the random part of the connectivity. Truncating Eq (61) at a given order, and averaging over Z yields a polynomial equation for the mean eigenvalues of J. In Methods Sec. 2.4, we exploit this equation to determine the effects of different cell-type specific random connectivity Z on the outlying eigenvalues.

Note that within first-order perturbation theory, the eigenvalues are given by λ = λ0 + Δλ with Δλ = ñᵀ Z m̃ / λ0. (63)
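The outlier condition can also be checked numerically without any truncation, by root-finding on Eq (60) outside the bulk. The sketch below does this for a toy rank-one-plus-noise matrix (the structure vectors and parameter values are stand-ins of our own choosing):

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(7)
N, g, lam0 = 800, 0.4, 1.3
m_t = np.ones(N) / np.sqrt(N)            # stand-in structure vectors with
n_t = lam0 * np.ones(N) / np.sqrt(N)     # n_tilde^T m_tilde = lambda_0
Z = g * rng.standard_normal((N, N)) / np.sqrt(N)
J = np.outer(m_t, n_t) + Z

def outlier_condition(lam):
    # Eq (60): n_tilde^T (lam*I - Z)^{-1} m_tilde = 1, valid outside the bulk.
    return n_t @ np.linalg.solve(lam * np.eye(N) - Z, m_t) - 1.0

lam_out = brentq(outlier_condition, 1.5 * g, 3.0)   # search right of the bulk
print("root of Eq (60):", round(lam_out, 4),
      "| largest eigenvalue of J:", np.linalg.eigvals(J).real.max().round(4))
```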

Eigenvectors. To determine the eigenvectors corresponding to the outlying eigenvalue of J, we treat J as a perturbation of J̄ by Z (Eq (24)). Matrix perturbation theory then states that, at first order, the right- and left-eigenvectors R and L of J corresponding to the outlying eigenvalue λ are given by [47]: (64) (65) where R̃ and L̃ are the right- and left-eigenvectors of J̄ defined in Eq (52), and (66)

Using the normalization in Eq (49), we then get m = m̃ + Δm and n = ñ + Δn, (67) with constant entries on m̃ and ñ defined in Eqs (54) and (56), and Δm = Z m̃/λ0, Δn = Zᵀ ñ/λ0, (68) where we approximated λ at first order by λ0.

Statistics of eigenvector entries. While m̃ and ñ are deterministic vectors, the perturbations Δm and Δn are random variables obtained by multiplying m̃ and ñ with the random matrix Z (Eq (68)). We therefore next consider the statistics of the elements mi and ni of m and n defined in Eq (67).

Since the elements of Z have zero mean, the mean values of mi and ni are given by m̃i and ñi defined in Eqs (54) and (56). The mean value of ni, but not mi, therefore depends on whether neuron i belongs to the excitatory or inhibitory population. Taking into account that Z has block-like statistics, we split the matrix products in Eq (68) into sums over excitatory and inhibitory pre-synaptic neurons. Using Eqs (54) and (56), Δmi and Δni can be written as (69)

We next take the limit NE, NI → ∞, and apply the central limit theorem, which states that each sum converges to a Gaussian random variable, so that we have (70) where p ∈ {E, I} is the population neuron i belongs to, and g²pq/N and g²qp/N are the variances of zij and zji respectively, for i, j in populations p, q. The perturbations Δmi and Δni therefore converge to Gaussian random variables of zero mean and variances σ²Δm and σ²Δn given by: (71)

In order to guarantee stability in the large network limit, we assume that the variances of the local random synaptic weights scale as 1/N (Eq (26)). Consequently, the variances σ²Δm and σ²Δn of the perturbations remain finite in the limit N → ∞ (Eqs (70) and (71)).

The population covariance between elements mi and ni with i belonging to population p can furthermore be written as (72) while the overall covariance σnm between all mi and ni reads (73)

Altogether, mi and ni determined by perturbation theory therefore follow Gaussian-mixture statistics, where the mean and variance depend on whether the neuron i belongs to the excitatory or inhibitory population.

Comparison with simulations. The theoretical predictions for eigenvalues obtained from Eqs (61) and (62) can be verified by comparing them with the eigenvalue outliers computed by direct eigen-decomposition of the full matrix J. We compute the average and standard deviation of eigenvalue outliers over 30 realizations of J.

The predictions of perturbation theory for eigenvectors given by Eq (66) can also be verified by direct eigen-decomposition, but to compare individual entries, an appropriate normalization is required [47]. Indeed, perturbation theory assumes that R̃ is normalized to unit norm and satisfies L̃ᵀR̃ = 1 (Eq (52)), but the perturbed eigenvectors in Eq (64) do not obey the same normalization. We therefore first use numerical eigen-decomposition to get the right- and left-eigenvectors R and L of J. We then normalize R to unit norm, and scale L so that LᵀR = 1. To compare with perturbation theory, we then normalize as (74); the eigenvectors L, R after this normalization have the same statistics as the perturbed eigenvectors (Eqs (62) and (64)).
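A compact numerical version of this check, under our reading of Eq (68) (Δm = Zm̃/λ0), is sketched below for a toy rank-one-plus-noise matrix; the predicted entry-wise standard deviation g/(λ0√N) follows from the central-limit argument above for this specific toy choice of m̃:

```python
import numpy as np

rng = np.random.default_rng(8)
N, g, lam0 = 1000, 0.3, 1.4
m_t = np.ones(N) / np.sqrt(N)            # stand-in structure vectors with
n_t = lam0 * np.ones(N) / np.sqrt(N)     # n_tilde^T m_tilde = lambda_0
Z = g * rng.standard_normal((N, N)) / np.sqrt(N)
J = np.outer(m_t, n_t) + Z

# First-order perturbation of the right eigenvector: dm = Z m_tilde / lambda_0.
dm = Z @ m_t / lam0
m_pred = m_t + dm

# True right eigenvector of the outlier (unit norm, sign-aligned with m_tilde).
w, V = np.linalg.eig(J)
R = V[:, np.argmax(w.real)].real
R *= np.sign(R @ m_t)

print("corr(m_pred, true eigenvector):", np.corrcoef(m_pred, R)[0, 1].round(4))
print("std of dm_i: empirical", dm.std().round(5),
      "| central-limit prediction g/(lam0*sqrt(N)) =",
      round(g / (lam0 * np.sqrt(N)), 5))
```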

2.4 Eigenvalues

Here, we apply Eqs (61) and (62) to determine the mean and variance of outlying eigenvalues for different forms of local connectivity statistics.

2.4.1 Independent random connectivity.

In the case of independent random connectivity, the elements of Z are zero-mean, independently distributed and uncorrelated with m̃ and ñ. Averaging Eq (62) over Z then leads to [32]: (75) (76) in the limit N → ∞ for all k > 0, and therefore the mean eigenvalue [λ] of J is given by the eigenvalue λ0 of J̄. For Gaussian random connectivity, we have λ0 = JE − JI, (77) and for sparse connectivity λ0 = c(NEAE + NIAI). (78)

The variance of λ can be computed by keeping only the linear term in Eq (61), which leads to Eq (63) under the assumption that λ ≈ λ0. Applying the central limit theorem then yields (79)

Given that the variances of the synaptic weights scale as 1/N to guarantee stability, the asymptotic variance of the first-order perturbation of the eigenvalue vanishes in the limit of large networks N → ∞.

For independent Gaussian random connectivity, we substitute the Gaussian variance parameters (Eqs (25) and (27)) and obtain (80)

For independent sparse random connectivity, we replace the variances g²pq and means J̄pq with those of the sparse model given in Eqs (29) and (30) and get (81)

2.4.2 Reciprocal motifs.

In the case of connectivity with reciprocal correlations, zij and zji are correlated, so that the average [θk] over Z in Eq (62) is non-zero for even k. Here we compute [θk] for k = 2 and truncate Eq (61) at second order to get a third-order polynomial equation for the mean eigenvalue: λ³ = λ0 λ² + [ñᵀ Z² m̃]. (82)

For Gaussian connectivity we have θ0 = λ0 = JE − JI, and θ1 is given by (83) so that [θ1] = 0.

The next term θ2 is (84)

Given the reciprocal correlations defined in Eq (31), only terms with i = j in θ2 are non-zero after averaging over Gaussian realizations, so that (85)

We then write (86) and substitute Eq (86) into Eq (85) to obtain (87)

For sparse connectivity with reciprocal motifs, the correlations can be written as (88)

Then, combining Eq (38) and Eqs (54) and (56), the second-order coefficient [θ2] for the sparse network is (89)

Using Eqs (38) and (39), it can be seen that Eq (89) is equivalent to Eq (87).

2.5 Eigenvectors

Here we apply Eqs (71) and (72) to determine the variances and covariances of eigenvector entries obtained from perturbation theory for different forms of local connectivity statistics.

2.5.1 Independent random connectivity.

In the case of independent random connectivity, because zik and zkj are not correlated in Eq (72), the covariances between the eigenvector entries are zero. For independent Gaussian connectivity, introducing Eq (26) into Eq (71) the variances of eigenvector entries can be written as (90)

For independent sparse connectivity, substituting Eq (30) into Eq (71) leads to (91)

2.5.2 Reciprocal motifs.

In the case of connectivity with reciprocal correlations, the variances of eigenvector entries are identical to the independent case.

As we have shown in Eq (72), correlations between the entries of the rank-one vectors arise from correlations between reciprocal pairs of random connectivity weights. In the presence of reciprocal motifs, only terms with i = j (for zik, zkj) in the same population q are non-zero, so that we have (92) with (93)

For Gaussian connectivity with reciprocal correlations, the covariances between entries on Z matched by population can be written as (94)

Combining Eq (94) and the mean rank-one connectivity loadings Eqs (54)–(56), we obtain the population covariances as (95)

We note that the large deviation of the dominant eigenvalue λ in the network with reciprocal motifs also increases the nonlinearity of the vector perturbations. To account for this nonlinearity, we start from Eq (42) for λ, compare with Eq (82), and obtain the approximate relationship (96)

Similarly, substituting λ for λ0 in Eq (95) for the covariance of each population yields (97)

For sparse connectivity with reciprocal correlations, the calculations are similar, with entries of Z being Bernoulli-distributed (98) and we have the population covariance (99)

Using Eqs (38) and (39), it can be seen that Eq (99) is equivalent to Eq (97).

2.6 Dynamics

In this section, we show how approximating locally-defined connectivity by a global low-rank structure allows us to analyse the emerging low-dimensional dynamics. We first summarize the mean-field theory (MFT) for Gaussian-mixture low-rank networks [31, 33]. We then apply it to unit-rank connectivity obtained as an approximation of locally-defined connectivity. We finally compare the resulting description of the dynamics with an alternate mean-field approach for random connectivity consisting of a superposition of low-rank and full-rank random parts as in Eq (24) [30, 32].

Throughout this study, we consider recurrent networks of rate units with recurrent interactions defined by a connectivity matrix J. The dynamical activity of unit i is represented by a variable xi(t), which we interpret as the total synaptic input current. The firing rate of unit i is given by ri(t) = ϕ(xi(t)), where ϕ(x) = 1 + tanh(x − θ) is a positive transfer function. We focus on networks without external inputs, so that the dynamics of the synaptic input to neuron i is given by dxi/dt = −xi + Σj Jij ϕ(xj). (100)
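A minimal simulation sketch is given below (our own code, with a toy E/I ratio; the steady state is projected onto the right eigenvector m, where the normalization mᵀx/mᵀm is our reading of Eq (104)):

```python
import numpy as np

def simulate(J, x0, T=50.0, dt=0.05, theta=1.5):
    """Euler integration of dx/dt = -x + J @ phi(x), phi(x) = 1 + tanh(x - theta)."""
    phi = lambda x: 1 + np.tanh(x - theta)
    x = x0.copy()
    for _ in range(int(T / dt)):
        x += dt * (-x + J @ phi(x))
    return x

# Example: a sparse E-I network as in Fig 8, projected onto m as in Eq (104).
rng = np.random.default_rng(9)
NE, NI = 800, 200
c, AE = 0.3, 0.025
AI = -AE / 0.4                            # assumption: AE/|AI| = 0.4, toy choice
A = np.concatenate([np.full(NE, AE), np.full(NI, AI)])
J = (rng.random((NE + NI, NE + NI)) < c) * A[None, :]

x = simulate(J, x0=2.0 * np.ones(NE + NI))
w, V = np.linalg.eig(J)
m = V[:, np.argmax(w.real)].real
kappa = (m @ x) / (m @ m)                 # latent variable, Eq (104)
print("kappa at steady state:", round(kappa, 3))
```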

In Figs 5–7, we compare the dynamics determined by direct simulations for a locally-defined connectivity matrix with a mean-field description obtained for a unit-rank approximation.

2.6.1 Mean-field theory for Gaussian-mixture low-rank connectivity.

Here we review the mean-field theory for networks in which the connectivity matrix is exactly low-rank, with components of connectivity vectors moreover drawn from a Gaussian-mixture distribution. Previous works have shown that in this case, the dynamics of the collective activity x(t) = {xi}i=1…N are embedded in a linear subspace of dimension R spanned by the connectivity vectors m(r) [30–33]. Thus, x(t) can be expressed as x(t) = Σr=1…R κr(t) m(r), (101) where κr(t) for r = 1…R are collective latent variables that quantify the components of x(t) along the connectivity vectors m(r). We assume that the m(r) are orthogonal to each other, so that κr(t) can be expressed as κr(t) = m(r)ᵀ x(t) / (m(r)ᵀ m(r)). (102)

For simplicity, here we moreover assume that the initial value of x(t) lies in the subspace spanned by the vectors m(r). More generally, the initial state can be included as an additional input to the dynamics [31, 33].

For a unit-rank connectivity J = m nᵀ, there is a single latent variable κ corresponding to the connectivity vector m, and the dynamics of x(t) is expressed as x(t) = κ(t) m, (103) with κ(t) given by κ(t) = mᵀ x(t) / (mᵀ m). (104)

Substituting Eq (103) into Eq (100) and inserting the unit-rank connectivity, the dynamics of the latent variable κ can be expressed as dκ/dt = −κ + κrec(t), (105) where κrec(t) = Σi ni ϕ(κ(t) mi). (106)

The quantity κrec(t) represents the total recurrent input to κ. The sum in the r. h. s. of Eq (106) can moreover be interpreted as the empirical average of niϕ(κ(t)mi) over the neurons in the network. In the limit of large network size N, this average converges to the integral of nϕ(κ(t)m) over the distribution P(m, n) of the components of the connectivity vectors. For low-rank networks, the mean-field limit corresponds to replacing κrec(t) with this integral [31, 33]: (107)

In the Gaussian-mixture low-rank model, each neuron i is assigned to a population p for p = 1…P. Within each population, the components (mi, ni) are generated from a multivariate Gaussian distribution Pp(m, n), that is (108)

In the mean-field limit, κrec is therefore given by (109) where αp is the fraction of neurons in population p.

Integrating by parts, κrec can be re-expressed as (S1 Text) (110)

Here μp and Δp are the mean and variance of the inputs to population p, given by (111) and the symbol 〈f(μ, Δ)〉 stands for the expected value of a function f(x) with respect to a Gaussian variable x with mean μ and variance Δ, that is 〈f(μ, Δ)〉 = ∫ Dz f(μ + √Δ z), (112) where Dz denotes the standard Gaussian measure.

Altogether, using MFT for Gaussian-mixture low-rank networks gives the closed dynamics of the latent variable κ: (113)

In particular, the corresponding steady state is given by (114)

Note that the first and second terms on the r. h. s. respectively correspond to the mean and covariance of the entries of the unit-rank connectivity vectors m and n.
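As an illustration, the sketch below locates fixed points of a κ-equation with this structure, using Gauss–Hermite quadrature for the Gaussian averages of Eq (112). The population parameters are toy values, and since the exact prefactors of Eqs (110)–(114) are not reproduced here, the sketch follows the conventions of refs [31, 33] rather than the article’s own normalization.

```python
import numpy as np

# Gauss-Hermite quadrature nodes/weights for averages over a standard Gaussian.
zs, ws = np.polynomial.hermite_e.hermegauss(61)
ws = ws / ws.sum()

def gauss_mean(f, mu, delta):
    """<f>(mu, Delta): Gaussian expectation, as in Eq (112)."""
    return ws @ f(mu + np.sqrt(delta) * zs)

def kappa_rhs(kappa, pops, theta=1.5):
    """r.h.s. of the kappa fixed-point equation, structured like Eq (114):
    sum_p alpha_p * [nbar_p * <phi> + sigma_nm_p * kappa * <phi'>],
    with mu_p = kappa * mbar_p and Delta_p = kappa^2 * sigma_m_p^2."""
    phi = lambda x: 1 + np.tanh(x - theta)
    dphi = lambda x: 1 - np.tanh(x - theta) ** 2
    total = 0.0
    for alpha, mbar, nbar, sm2, snm in pops:
        mu, delta = kappa * mbar, kappa**2 * sm2
        total += alpha * (nbar * gauss_mean(phi, mu, delta)
                          + snm * kappa * gauss_mean(dphi, mu, delta))
    return total

# Toy two-population parameters (alpha, mbar, nbar, sigma_m^2, sigma_nm).
pops = [(0.8, 1.0, 1.2, 0.4, 0.0), (0.2, 1.0, -0.4, 0.4, 0.0)]
ks = np.linspace(-2.0, 6.0, 2001)
resid = np.array([kappa_rhs(k, pops) - k for k in ks])
roots = ks[:-1][np.sign(resid[:-1]) != np.sign(resid[1:])]
print("approximate fixed points kappa*:", np.round(roots, 2))
```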

2.6.2 Approximate dynamics for locally-defined connectivity.

We next apply the MFT to unit-rank connectivity obtained as an approximation of locally-defined connectivity for the different considered cases.

Independent connectivity. We start from the network with independent connectivity, in which case the unit-rank connectivity vectors obtained by approximating the locally-defined connectivity have zero covariance, i. e. σnm = 0 (Methods Sec. 2.5).

The dynamical system for the latent variable κ therefore contains only the mean term (115)

For the Gaussian random model, inserting the expressions for m̃ and ñ (Eqs (54)–(56)), the fixed point obeys (116) where the variance of connectivity components mi is given by Eq (90).

For the sparse random model, we further use Eqs (38) and (54)–(56), and the fixed point is (117) where the variance of connectivity components mi is obtained from Eq (91).

Reciprocal motifs. Correlations between reciprocal connections lead to non-zero covariance between the unit-rank connectivity vectors obtained by approximating locally-defined connectivity (Methods Sec. 2.3, Eq (72)). The dynamical system for the latent variable κ therefore contains both the mean and covariance terms (Eq (114)).

For the Gaussian random model, combining Eqs (54)–(56), (90) and (97) the fixed point obeys (118) with the variance of connectivity components mi given by Eq (90). For the sparse model, combining Eqs (38), (54)–(56), (91) and (99), the fixed point obeys (119) with the variance of connectivity components mi given by Eq (91).

2.6.3 Mean-field theory for superpositions of low-rank and full rank random connectivity.

Here we review an alternate form of mean-field theory for random connectivity consisting of a superposition of a low-rank structure and full-rank random part [30, 32]. This form of MFT can be directly applied to independently generated connections, where the connectivity matrix consists precisely of a superposition of a low-rank part corresponding to the mean, and a full-rank random part corresponding to fluctuations (Eqs (24) and (121)). Extending this type of MFT to the situation where reciprocal connections are present is however challenging [18]. Moreover, in contrast to the case where connectivity is exactly low-rank, when the additional full-rank random part is present the mean-field theory describes only the steady-state activity (and linearized dynamics around it), but not the full dynamics as in Eq (100).

The key assumption of MFT for randomly connected networks is that the total input xi to each unit can be approximated as a stochastic Gaussian process [53]. The first two cumulants (mean and variance) of that Gaussian process are then computed self-consistently to characterize the steady-state activity.

At a fixed point, the total input xi obeys xi = Σj Jij ϕ(xj). (120)

Replacing Jij, where i, j belong to populations p, q respectively, by the superposition of rank-one mean and full-rank random connectivity components, we get (121)

Denoting by [⋅] the average over the distribution of xi, the mean of xi can then be expressed as (122) where we introduced (123) and we assumed that the zero-mean random connectivity zij is uncorrelated with the firing rate ϕ(xj), so that (124)

Similarly, the correlation between xi and xj, where i ∈ Np and j ∈ Nq, is given by (125) where we assume the neuronal activities are decorrelated, [ϕ(xi)ϕ(xj)] = [ϕ(xi)][ϕ(xj)] when i ≠ j. This assumption holds for independently-generated connections, but not in the presence of reciprocal correlations [18]. The covariance between xi and xj therefore becomes (126)

Within the mean-field approximation, the neuronal activations xi are therefore uncorrelated Gaussian variables with mean and variance given by Eqs (122) and (126) (127)

To determine [ϕ(xk)] and [ϕ(xk)²], we finally express Eqs (123) and (127) as Gaussian integrals over xi in population p: (128)

Here we used the eigenvector normalization in Eq (54), together with the assumption that the variances depend only on the populations the units i and k belong to (Eqs (26), (30) and (54)). Therefore, the stationary mean and variance of the synaptic inputs in population p are (129)

Eqs (128) and (129) give the self-consistent equations for the stationary solutions of the dynamics.

More specifically, in the Gaussian random model, we combine the connectivity statistics given by Eqs (26) and (54)–(56), so that we have (130) while for the sparse random model, we combine the connectivity statistics given by Eqs (29), (30), (38) and (54)–(56), so that we have (131)

Supporting information

S1 Text. Dynamics in Gaussian-mixture low-rank networks.

https://doi.org/10.1371/journal.pcbi.1010855.s001

(PDF)

S2 Text. Linear stability at fixed points in rank-one networks.

https://doi.org/10.1371/journal.pcbi.1010855.s002

(PDF)

S3 Text. Comparison between dynamics in full-rank connectivity with rank-one approximation.

https://doi.org/10.1371/journal.pcbi.1010855.s003

(PDF)

S4 Text. Chaotic dynamical transition point for networks with independent connectivity.

https://doi.org/10.1371/journal.pcbi.1010855.s004

(PDF)

S5 Text. Local connectivity statistics in rank-R Gaussian-mixture models.

https://doi.org/10.1371/journal.pcbi.1010855.s005

(PDF)

S6 Text. First-order approximation of eigenvalues and eigenvectors.

https://doi.org/10.1371/journal.pcbi.1010855.s006

(PDF)

S1 Fig. Comparison of singular value decomposition- (SVD-) and eigendecomposition-based low-rank approximation.

Blue dots in (A, B) show eigenvalue spectra of Gaussian excitatory-inhibitory full-rank matrices J, with a mean connectivity J̄ that is in general of rank two, and i. i. d. random parts with identical variances g²/N over neurons. Blue dots in the circular bulk show N − 1 complex eigenvalues for one realization of the random connectivity; outlying eigenvalues (blue dots) are shown for 30 realizations of the random connectivity. Dashed envelopes indicate the theoretical predictions for the radius rg = g of the circular bulk; red circles represent the eigenvalue of J̄ corresponding to the outlier of J. Network parameters: NE = 2NI = 600, N = NE + NI, g = 0.8. Purple triangles in (A) show the eigenvalues of the eigendecomposition-based rank-one approximation for the corresponding 30 realizations of the full-rank matrices. Their location on the y-axis is shifted to help visualization. Purple triangles in (B) show the eigenvalues of the SVD-based rank-one approximation for the corresponding 30 realizations of the full-rank matrices.

https://doi.org/10.1371/journal.pcbi.1010855.s007

(TIF)

Acknowledgments

We are grateful to David Dahmen, Eric Shea-Brown and Stefano Recanatesi for helpful discussions during the revision of this manuscript.

References

  1. 1. Song S, Sjöström PJ, Reigl M, Nelson S, Chklovskii DB. Highly nonrandom features of synaptic connectivity in local cortical circuits. PLoS biology. 2005;3(3):e68. pmid:15737062
  2. 2. Perin R, Berger TK, Markram H. A synaptic organizing principle for cortical neuronal groups. Proceedings of the National Academy of Sciences. 2011;108(13):5419–5424. pmid:21383177
  3. 3. Barbour B, Brunel N, Hakim V, Nadal JP. What can we learn from synaptic weight distributions? TRENDS in Neurosciences. 2007;30(12):622–629. pmid:17983670
  4. 4. Ko H, Hofer SB, Pichler B, Buchanan KA, Sjöström PJ, Mrsic-Flogel TD. Functional specificity of local synaptic connections in neocortical networks. Nature. 2011;473(7345):87–91. pmid:21478872
  5. 5. Harris KD, Mrsic-Flogel TD. Cortical connectivity and sensory coding. Nature. 2013;503(7474):51–58. pmid:24201278
  6. 6. Seeman SC, Campagnola L, Davoudian PA, Hoggarth A, Hage TA, Bosma-Moody A, et al. Sparse recurrent excitatory connectivity in the microcircuit of the adult mouse and human cortex. elife. 2018;7:e37349. pmid:30256194
  7. 7. Campagnola L, Seeman SC, Chartrand T, Kim L, Hoggarth A, Gamlin C, et al. Local connectivity and synaptic dynamics in mouse and human neocortex. Science. 2022;375(6585):eabj5861. pmid:35271334
  8. 8. Fino E, Yuste R. Dense inhibitory connectivity in neocortex. Neuron. 2011;69(6):1188–1203. pmid:21435562
  9. 9. Vegué M, Perin R, Roxin A. On the structure of cortical microcircuits inferred from small sample sizes. Journal of Neuroscience. 2017;37(35):8498–8510.
  10. 10. Trousdale J, Hu Y, Shea-Brown E, Josić K. Impact of network structure and cellular response on spike time correlations. PLoS computational biology. 2012;8(3):e1002408. pmid:22457608
  11. 11. Aljadeff J, Stern M, Sharpee T. Transition to chaos in random networks with cell-type-specific connectivity. Physical review letters. 2015;114(8):088101. pmid:25768781
  12. 12. Aljadeff J, Renfrew D, Stern M. Eigenvalues of block structured asymmetric random matrices. Journal of Mathematical Physics. 2015;56(10):103502.
  13. 13. Aljadeff J, Renfrew D, Vegué M, Sharpee TO. Low-dimensional dynamics of structured random networks. Physical Review E. 2016;93(2):022302. pmid:26986347
  14. 14. Recanatesi S, Ocker GK, Buice MA, Shea-Brown E. Dimensionality in recurrent spiking networks: Global trends in activity and local origins in connectivity. PLoS computational biology. 2019;15(7):e1006446. pmid:31299044
  15. 15. Dahmen D, Recanatesi S, Jia X, Ocker GK, Campagnola L, Jarsky T, et al. Strong and localized coupling controls dimensionality of neural activity across brain areas. bioRxiv. 2021; p. 2020–11.
16. Potjans TC, Diesmann M. The cell-type specific cortical microcircuit: relating structure and activity in a full-scale spiking network model. Cerebral Cortex. 2014;24(3):785–806. pmid:23203991
17. Brunel N. Is cortical connectivity optimized for storing information? Nature Neuroscience. 2016;19(5):749–755. pmid:27065365
18. Martí D, Brunel N, Ostojic S. Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks. Physical Review E. 2018;97(6):062314.
19. Guzman SJ, Schlögl A, Frotscher M, Jonas P. Synaptic mechanisms of pattern completion in the hippocampal CA3 network. Science. 2016;353(6304):1117–1123. pmid:27609885
20. Lizier JT, Atay FM, Jost J. Information storage, loop motifs, and clustered structure in complex networks. Physical Review E. 2012;86(2):026110. pmid:23005828
21. Bullmore E, Sporns O. Complex brain networks: graph theoretical analysis of structural and functional systems. Nature Reviews Neuroscience. 2009;10(3):186–198. pmid:19190637
22. Hu Y, Trousdale J, Josić K, Shea-Brown E. Motif statistics and spike correlations in neuronal networks. Journal of Statistical Mechanics: Theory and Experiment. 2013;2013(03):P03012.
23. Hu Y, Brunton SL, Cain N, Mihalas S, Kutz JN, Shea-Brown E. Feedback through graph motifs relates structure and function in complex networks. Physical Review E. 2018;98(6):062312.
24. Ocker GK, Hu Y, Buice MA, Doiron B, Josić K, Rosenbaum R, et al. From the statistics of connectivity to the statistics of spike times in neuronal networks. Current Opinion in Neurobiology. 2017;46:109–119. pmid:28863386
25. Hopfield JJ. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences. 1982;79(8):2554–2558. pmid:6953413
26. Amit DJ, Gutfreund H, Sompolinsky H. Storing infinite numbers of patterns in a spin-glass model of neural networks. Physical Review Letters. 1985;55(14):1530. pmid:10031847
27. Boerlin M, Machens CK, Denève S. Predictive coding of dynamical variables in balanced spiking networks. PLoS Computational Biology. 2013;9(11):e1003258. pmid:24244113
28. Eliasmith C, Anderson CH. Neural engineering: Computation, representation, and dynamics in neurobiological systems. MIT Press; 2003.
29. Sussillo D, Abbott LF. Generating coherent patterns of activity from chaotic neural networks. Neuron. 2009;63(4):544–557. pmid:19709635
30. Mastrogiuseppe F, Ostojic S. Linking connectivity, dynamics, and computations in low-rank recurrent neural networks. Neuron. 2018;99(3):609–623. pmid:30057201
31. Beiran M, Dubreuil A, Valente A, Mastrogiuseppe F, Ostojic S. Shaping dynamics with multiple populations in low-rank recurrent networks. Neural Computation. 2021;33(6):1572–1615. pmid:34496384
32. Schuessler F, Dubreuil A, Mastrogiuseppe F, Ostojic S, Barak O. Dynamics of random recurrent networks with correlated low-rank structure. Physical Review Research. 2020;2(1):013111.
33. Dubreuil A, Valente A, Beiran M, Mastrogiuseppe F, Ostojic S. The role of population structure in computations through neural dynamics. Nature Neuroscience. 2022; p. 1–12. pmid:35668174
34. Pereira U, Brunel N. Attractor dynamics in networks with learning rules inferred from in vivo data. Neuron. 2018;99(1):227–238. pmid:29909997
35. Landau ID, Sompolinsky H. Coherent chaos in a recurrent neural network with structured connectivity. PLoS Computational Biology. 2018;14(12):e1006309. pmid:30543634
36. Landau ID, Sompolinsky H. Macroscopic fluctuations emerge in balanced networks with incomplete recurrent alignment. Physical Review Research. 2021;3(2):023171.
37. Beiran M, Meirhaeghe N, Sohn H, Jazayeri M, Ostojic S. Parametric control of flexible timing through low-dimensional neural manifolds. Available at SSRN 3967676. 2021.
38. Kadmon J, Timcheck J, Ganguli S. Predictive coding in balanced neural networks with noise, chaos and delays. Advances in Neural Information Processing Systems. 2020;33:16677–16688.
39. Logiaco L, Abbott L, Escola S. Thalamic control of cortical dynamics in a model of flexible motor sequencing. Cell Reports. 2021;35(9):109090. pmid:34077721
40. Ahmadian Y, Fumarola F, Miller KD. Properties of networks with partially structured and partially random connectivity. Physical Review E. 2015;91(1):012820. pmid:25679669
41. Murphy BK, Miller KD. Balanced amplification: a new mechanism of selective amplification of neural activity patterns. Neuron. 2009;61(4):635–648. pmid:19249282
42. Pettine WW, Louie K, Murray JD, Wang XJ. Excitatory-inhibitory tone shapes decision strategies in a hierarchical neural network model of multi-attribute choice. PLoS Computational Biology. 2021;17(3):e1008791. pmid:33705386
43. Brunel N. Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. Journal of Computational Neuroscience. 2000;8(3):183–208. pmid:10809012
44. Rajan K, Abbott LF. Eigenvalue spectra of random matrices for neural networks. Physical Review Letters. 2006;97(18):188104. pmid:17155583
45. Tao T. Outliers in the spectrum of iid matrices with bounded rank perturbations. Probability Theory and Related Fields. 2013;155(1):231–263.
46. Rivkind A, Barak O. Local dynamics in trained recurrent neural networks. Physical Review Letters. 2017;118(25):258101. pmid:28696758
47. Greenbaum A, Li RC, Overton ML. First-order perturbation theory for eigenvalues and eigenvectors. SIAM Review. 2020;62(2):463–482.
48. Sommers HJ, Crisanti A, Sompolinsky H, Stein Y. Spectrum of large random asymmetric matrices. Physical Review Letters. 1988;60:1895–1898. pmid:10038170
49. Amari SI. Homogeneous nets of neuron-like elements. Biological Cybernetics. 1975;17(4):211–220. pmid:1125349
50. Amari SI. Dynamics of pattern formation in lateral-inhibition type neural fields. Biological Cybernetics. 1977;27(2):77–87. pmid:911931
51. Cohen MA, Grossberg S. Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Transactions on Systems, Man, and Cybernetics. 1983;(5):815–826.
52. Cowan JD, Neuman J, van Drongelen W. Wilson–Cowan equations for neocortical dynamics. The Journal of Mathematical Neuroscience. 2016;6(1):1–24. pmid:26728012
53. Sompolinsky H, Crisanti A, Sommers HJ. Chaos in random neural networks. Physical Review Letters. 1988;61(3):259. pmid:10039285
54. Ostojic S. Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons. Nature Neuroscience. 2014;17(4):594–600. pmid:24561997
55. Herbert E, Ostojic S. The impact of sparsity in low-rank recurrent neural networks. bioRxiv. 2022.
56. Tao T, Vu V, Krishnapur M. Random matrices: Universality of ESDs and the circular law. The Annals of Probability. 2010;38(5):2023–2065.
57. Markovsky I. Low rank approximation: algorithms, implementation, applications. vol. 906. Springer; 2012.
58. Eckart C, Young G. The approximation of one matrix by another of lower rank. Psychometrika. 1936;1(3):211–218.
59. Benaych-Georges F, Nadakuditi RR. The singular values and vectors of low rank perturbations of large rectangular random matrices. Journal of Multivariate Analysis. 2012;111:120–135.
60. Farrell BF, Ioannou PJ. Accurate low-dimensional approximation of the linear dynamics of fluid flow. Journal of the Atmospheric Sciences. 2001;58(18):2771–2789.
61. Durstewitz D. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements. PLoS Computational Biology. 2017;13(6):e1005542. pmid:28574992
62. Schaeffer R, Khona M, Meshulam L, Fiete IR, et al. Reverse-engineering recurrent neural network solutions to a hierarchical inference task for mice. bioRxiv. 2020.
63. Bondanelli G, Deneux T, Bathellier B, Ostojic S. Network dynamics underlying OFF responses in the auditory cortex. eLife. 2021;10:e53151. pmid:33759763
64. Langdon C, Engel TA. Latent circuit inference from heterogeneous neural responses during cognitive tasks. bioRxiv. 2022.
65. Valente A, Pillow JW, Ostojic S. Extracting computational mechanisms from neural data using low-rank RNNs. In: Oh AH, Agarwal A, Belgrave D, Cho K, editors. Advances in Neural Information Processing Systems; 2022. Available from: https://openreview.net/forum?id=M12autRxeeS.
66. Hu Y, Sompolinsky H. The spectrum of covariance matrices of randomly connected recurrent neuronal networks with linear dynamics. PLoS Computational Biology. 2022;18(7):e1010327. pmid:35862445
67. Clark DG, Abbott LF, Litwin-Kumar A. Dimension of activity in random neural networks. arXiv preprint arXiv:2207.12373. 2022.
68. Amit DJ, Brunel N. Model of global spontaneous activity and local structured activity during delay periods in the cerebral cortex. Cerebral Cortex. 1997;7(3):237–252. pmid:9143444
69. Bimbard C, Ledoux E, Ostojic S. Instability to a heterogeneous oscillatory state in randomly connected recurrent networks with delayed interactions. Physical Review E. 2016;94(6):062207. pmid:28085410
70. Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Computational Biology. 2019;15(3):e1006893. pmid:30897092
71. Renart A, De La Rocha J, Bartho P, Hollender L, Parga N, Reyes A, et al. The asynchronous state in cortical circuits. Science. 2010;327(5965):587–590. pmid:20110507
72. Stewart GW. Matrix Algorithms: Volume II: Eigensystems. SIAM; 2001.
73. Buzsáki G, Mizuseki K. The log-dynamic brain: how skewed distributions affect network operations. Nature Reviews Neuroscience. 2014;15(4):264–278.
74. Ocker GK, Josić K, Shea-Brown E, Buice MA. Linking structure and activity in nonlinear spiking networks. PLoS Computational Biology. 2017;13(6):e1005583. pmid:28644840
75. Nykamp DQ, Friedman D, Shaker S, Shinn M, Vella M, Compte A, et al. Mean-field equations for neuronal networks with arbitrary degree distributions. Physical Review E. 2017;95(4):042323. pmid:28505854
76. Ostojic S, Brunel N. From spiking neuron models to linear-nonlinear models. PLoS Computational Biology. 2011;7(1):e1001056. pmid:21283777
77. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal dynamics: From single neurons to networks and models of cognition. Cambridge University Press; 2014.
78. Cimesa L, Ciric L, Ostojic S. Geometry of population activity in spiking networks with low-rank structure. bioRxiv. 2022.
79. Valente A, Ostojic S, Pillow J. Probing the relationship between linear dynamical systems and low-rank recurrent neural network models. Neural Computation. 2022;34(9):1871–1892. pmid:35896161
80. Hu Y, Trousdale J, Josić K, Shea-Brown E. Local paths to global coherence: Cutting networks down to size. Physical Review E. 2014;89(3):032802. pmid:24730894
81. Luo L. Architectures of neuronal circuits. Science. 2021;373(6559):eabg7285. pmid:34516844
82. Perin R, Berger TK, Markram H. A synaptic organizing principle for cortical neuronal groups. Proceedings of the National Academy of Sciences. 2011;108(13):5419–5424. pmid:21383177
83. Fischer KH, Hertz JA. Spin glasses. vol. 1. Cambridge University Press; 1993.
84. Horn RA, Johnson CR. Matrix analysis. Cambridge University Press; 2012.
85. Girko VL. Circular law. Theory of Probability & Its Applications. 1985;29(4):694–706.
86. Kuczala A, Sharpee TO. Eigenvalue spectra of large correlated random matrices. Physical Review E. 2016;94(5):050101. pmid:27967175
87. Dunford N, Schwartz JT, Badé WG, Bartle RG. Linear Operators: Self Adjoint Operators in Hilbert Space. Interscience Publishers; 1963.