Genetic Redundancies Enhance Information Transfer in Noisy Regulatory Circuits

Cellular decision making is based on regulatory circuits that associate signal thresholds with specific physiological actions. This transmission of information is subject to molecular noise, which can decrease its fidelity. Here, we show instead how such intrinsic noise enhances information transfer in the presence of multiple circuit copies. The result is due to the contribution of noise to the generation of autonomous responses by each copy, which are jointly associated with a common decision. Moreover, factors that correlate the responses of the redundant units (extrinsic noise or regulatory cross-talk) reduce fidelity, while those that further uncouple them (heterogeneity within the copies) can lead to stronger information gain. Overall, our study emphasizes how the interplay of signal thresholding, redundancy, and noise influences the accuracy of cellular decision making. Understanding this interplay provides a basis for explaining collective cell signaling mechanisms, and for engineering robust decisions with noisy genetic circuits.


Suprathreshold stochastic resonance
Suprathreshold stochastic resonance is a class of stochastic resonance phenomena that explains how the transmission of information by a summing network of N identical nonlinear devices can be enhanced in the presence of noise. In the original work [1], each nonlinear device denotes a threshold unit, which is subject to the same input signal x(t) but to independent Gaussian noise ξ_i(t) of the same standard deviation (Fig. S2A). Each unit i is modeled as a Heaviside function (with threshold level θ_i), and each output, y_i(t), is equal to unity if x(t) + ξ_i(t) > θ_i (and zero otherwise). This architecture was similar in spirit to earlier work on the signal-processing activity of neuronal ensembles (crossing the threshold represents the "firing" of a neuron). Finally, the response of the network y(t) is obtained by summing the individual responses of the devices, i.e., it represents the number of devices that are triggered.
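As a minimal illustration of this summing architecture, the sketch below implements N Heaviside units that share an input but receive independent Gaussian noise; the function name, the threshold θ = 0, and the noise levels are placeholder choices, not values from [1].

```python
import numpy as np

def summing_network(x, n_units=5, theta=0.0, sigma=1.0, rng=None):
    """Count of Heaviside units with y_i = 1 when x + xi_i > theta.

    All units see the same input x but independent Gaussian noise
    xi_i of standard deviation sigma; the network output is the
    number of triggered units.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    noise = rng.normal(0.0, sigma, size=n_units)
    return int(np.sum(x + noise > theta))

# Without noise every unit gives the same answer, so the network
# output is only ever 0 or n_units; noise decorrelates the units
# and intermediate counts appear, enriching the output alphabet.
y_clean = summing_network(1.0, sigma=0.0)
rng = np.random.default_rng(1)
y_noisy = [summing_network(1.0, sigma=1.0, rng=rng) for _ in range(1000)]
```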
The information processing capability of this system was measured by the mutual information I(x, y) = ∫∫ P(x, y) log_2 [P(x, y) / (P(x) P(y))] dx dy, where P(x, y) is the joint probability distribution of input and output, and P(x) and P(y) are the corresponding marginal probabilities (I is measured in bits). A noisier input would generally be expected to lead to lower I; however, there are situations in which performance is enhanced by the presence of a certain amount of noise (when there exist several copies of the device, Fig. S2B). Note that other typical measures of dependence, like the correlation coefficient, would be less sensitive. To examine this issue, we computed the transmitted information of an input signal as a function of noise strength using either the correlation coefficient, r, or the mutual information, I (for an array of N = 5 threshold units). Notably, the correlation coefficient is not strongly affected by noise strength, while the mutual information changes and exhibits a maximum, i.e., two explicit regimes of low/high noise present similar correlation coefficients, r_1 ≈ r_2, but different mutual information, I_1 > I_2 (Figs. S2C-D). Since this resonance phenomenon was shown to appear regardless of whether the signal is entirely subthreshold or not, the term suprathreshold was adopted to distinguish it from more standard stochastic resonance phenomena in single threshold devices (where suprathreshold signals do not experience resonance).

Extrinsic noise, cross-talk and heterogeneity
We considered the architecture of N bistable units to specifically characterize the effects of extrinsic noise, cross-talk, and heterogeneity on information transfer. Recall that in this case each i-th unit corresponds to a minimal implementation of a bistable system as

dy_i/dt = α_0 + α y_i^2/(1 + y_i^2) − y_i + x + ξ_i(t),     (1)

where expression and time are appropriately rescaled to have a dimensionless model. The parameter values, as well as the amplitude and statistics of ξ_i, are as described in the main text (Materials and Methods).
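A single bistable unit can be integrated with a simple Euler-Maruyama scheme. The drift f(y) = α_0 + α y^2/(1 + y^2) − y + x used below is a minimal positive-feedback form consistent with the deterministic output levels quoted later (α_0 and (α + √(α² − 4))/2); the parameter values and the additive entry of x are illustrative assumptions, not those of the main text.

```python
import numpy as np

def simulate_unit(x, alpha0=0.05, alpha=2.5, q=0.0, y0=0.05,
                  dt=0.01, t_max=200.0, seed=0):
    """Euler-Maruyama integration of one bistable unit.

    dy/dt = alpha0 + alpha*y^2/(1+y^2) - y + x + q*xi(t), an assumed
    minimal bistable drift; q is the intrinsic-noise amplitude.
    """
    rng = np.random.default_rng(seed)
    y = y0
    for _ in range(int(t_max / dt)):
        drift = alpha0 + alpha * y**2 / (1 + y**2) - y + x
        y += drift * dt + q * np.sqrt(dt) * rng.normal()
        y = max(y, 0.0)  # expression levels stay non-negative
    return y

# With alpha = 2.5 the two deterministic attractors sit near
# alpha0 ~ 0.06 and (alpha + sqrt(alpha^2 - 4))/2 ~ 2.08,
# depending on the initial condition.
y_low = simulate_unit(x=0.0, y0=0.05)
y_high = simulate_unit(x=0.0, y0=3.0)
```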
To account for extrinsic noise, we introduced a new stochastic process (ξ_ex), common to all units, in Eq. (1) as

dy_i/dt = α_0 + α y_i^2/(1 + y_i^2) − y_i + x + ξ_i(t) + ξ_ex(t).

The correlation time of extrinsic noise is of the order of the cell cycle [2] (its mean is 0). For simplicity, we assumed a system implemented with short-lived proteins, so that ξ_ex can be taken as constant within the time window that the unit needs to reach its steady state upon receiving the perturbation x.
To account for heterogeneity in the different units of the system, we introduced a random factor ω_i into the dynamics of each unit, where the standard deviation of ω_i (a Gaussian random number with mean 1) quantifies the degree of heterogeneity.
To account for a certain cross-talk between the different units of the summing network, we applied a perturbative approach to Eq. (1) to obtain a weakly coupled version of the dynamics, where ε quantifies the degree of cross-talk. We assumed q_i(y_i) not to depend on y_j for j ≠ i.
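The three perturbations above can be sketched in a simulation of the N-unit network. The specific forms assumed here — ω_i rescaling each unit's α, a single shared extrinsic draw held constant over the response window, and a diffusive coupling to the network mean — are illustrative stand-ins, not the forms used in the original model.

```python
import numpy as np

def simulate_network(x, n_units=5, alpha0=0.05, alpha=2.5, q=0.05,
                     q_ex=0.0, het=0.0, eps=0.0,
                     dt=0.01, t_max=100.0, seed=0):
    """N bistable units with extrinsic noise, heterogeneity, cross-talk.

    Assumed forms (illustrative only): omega_i ~ N(1, het) rescales
    each unit's alpha; xi_ex is one draw shared by all units and held
    constant over the window; eps couples units to the network mean.
    """
    rng = np.random.default_rng(seed)
    omega = rng.normal(1.0, het, size=n_units)   # heterogeneity
    xi_ex = q_ex * rng.normal()                  # extrinsic noise (shared)
    y = np.full(n_units, alpha0)
    for _ in range(int(t_max / dt)):
        drift = (alpha0 + omega * alpha * y**2 / (1 + y**2) - y
                 + x + xi_ex + eps * (y.mean() - y))   # cross-talk term
        y = np.maximum(y + drift * dt
                       + q * np.sqrt(dt) * rng.normal(size=n_units), 0.0)
    return y

# Deterministic, homogeneous, uncoupled limit: all units settle
# at the same low-expression state.
y_end = simulate_network(x=0.0, q=0.0)
```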

Calculation of mutual information
To calculate mutual information, we solved numerically the integral I = ∫∫ P(log x, Δy) log_2 [P(log x, Δy) / (P(log x) P(Δy))] d(log x) dΔy, where we considered log x as the input and Δy as the output variable. By using the Fokker-Planck equation, we calculated the probability that a unit has a given expression level, P_i(y_i|x) [3]. The effective stochastic potential associated with Eq. (1) is φ_i(y_i, x) = −(1/q^2) ∫_0^{y_i} f_i(s, x) ds, having defined f_i as the right-hand side of the corresponding dynamical equations [e.g., Eq. (1) for the bistable system] without the stochastic process, and q as the amplitude of the intrinsic noise. Thus, P_i(y_i|x) = C e^{−2φ_i(y_i, x)}, where C is a normalization constant such that ∫_0^∞ P_i(s|x) ds = 1. In addition, when considering extrinsic noise (for the bistable system), we averaged this distribution over realizations of the constant extrinsic perturbation ζ, where ζ is a Gaussian random number with mean 0 and standard deviation q_ex. In the case of cross-talk, we proceeded analogously with the cross-talk-corrected drift. Finally, in the case of the excitable system, we calculated P_i(y_i, z_i|x) numerically. Moreover, for the non-dynamic situation, i.e., passing signals through a threshold device described by a Heaviside function (Fig. S2), we also computed mutual information numerically.
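The potential-based construction of P(y|x) can be tabulated directly on a grid. The drift below is the same assumed minimal bistable form as earlier, with illustrative parameter values and noise amplitude q.

```python
import numpy as np

def stationary_pdf(x, alpha0=0.05, alpha=2.5, q=0.1, y_max=5.0, n=2000):
    """P(y|x) = C * exp(-2*phi(y, x)) from the stochastic potential.

    phi(y, x) = -(1/q^2) * int_0^y f(s, x) ds, with f an assumed
    minimal bistable drift and q the intrinsic-noise amplitude.
    """
    y = np.linspace(0.0, y_max, n)
    dy = y[1] - y[0]
    f = alpha0 + alpha * y**2 / (1 + y**2) - y + x   # deterministic drift
    # cumulative trapezoidal integral of f, starting at phi(0) = 0
    integral = np.concatenate(([0.0], np.cumsum((f[1:] + f[:-1]) / 2) * dy))
    phi = -integral / q**2
    p = np.exp(-2 * (phi - phi.min()))     # shift exponent to avoid overflow
    p /= ((p[1:] + p[:-1]) / 2).sum() * dy # trapezoidal normalization
    return y, p

# For this drift and small q, the mode of P(y|x=0) sits at the
# stable high-expression fixed point of f.
y, p = stationary_pdf(x=0.0)
```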

Calculation of distance to linear response
To calculate the distance to linear response (d_lin) [4], we averaged all realizations of y_i for a given x to obtain an average response, ⟨Δy_i(x)⟩ (taking into account the initial conditions). Note that x varies uniformly between x_min and x_max. In the case of the bistable system, the two output values in the deterministic regime are approximately α_0 and (α + √(α² − 4))/2. Then, we considered the ideal linear response as

y_lin(x) = α_0 + [(α + √(α² − 4))/2] (x − x_min)/(x_max − x_min).

The distance is then computed as the root-mean-square deviation of the average response from this ideal line over the range of x, d_lin = ⟨(⟨Δy(x)⟩ − y_lin(x))²⟩^{1/2}.
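A direct numerical transcription of this calculation; the RMS form of the distance and the parameter values are illustrative assumptions.

```python
import numpy as np

def distance_to_linear(x, y_mean, alpha0=0.05, alpha=2.5):
    """RMS deviation of the mean response from the ideal linear ramp.

    y_lin interpolates between the two deterministic output levels,
    alpha0 and (alpha + sqrt(alpha^2 - 4))/2, as x sweeps its range.
    Both the RMS form and the parameters are illustrative.
    """
    y_hi = (alpha + np.sqrt(alpha**2 - 4)) / 2
    y_lin = alpha0 + y_hi * (x - x.min()) / (x.max() - x.min())
    return float(np.sqrt(np.mean((y_mean - y_lin) ** 2)))

x = np.linspace(0.0, 1.0, 101)
ramp = 0.05 + 2.0 * x                        # (2.5 + sqrt(2.25))/2 = 2.0
step = np.where(x > 0.5, 0.05 + 2.0, 0.05)   # all-or-none response

d_ramp = distance_to_linear(x, ramp)   # a perfectly linear response
d_step = distance_to_linear(x, step)   # a switch-like response
```

A graded (linear) response gives d_lin near zero, while a switch-like response deviates strongly from the ideal line.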