Latching dynamics in neural networks with synaptic depression

Prediction is the ability of the brain to quickly activate a target concept in response to a related stimulus (prime). Experiments point to the existence of an overlap between the populations of neurons coding for different stimuli, and other experiments show that prime-target relations arise in the process of long-term memory formation. The classical modelling paradigm is that long-term memories correspond to stable steady states of a Hopfield network with Hebbian connectivity. Experiments show that short-term synaptic depression plays an important role in the processing of memories. This leads naturally to a computational model of priming, called latching dynamics: a stable state (prime) can become unstable, and the system may then converge to another transiently stable steady state (target). Hopfield network models of latching dynamics have been studied by means of numerical simulation; however, the conditions for the existence of such dynamics have not been elucidated. In this work we use a combination of analytic and numerical approaches to confirm that latching dynamics can exist in the context of a symmetric Hebbian learning rule, but that it lacks robustness and imposes a number of biologically unrealistic restrictions on the model. In particular, our work shows that the symmetry of the Hebbian rule is not an obstruction to the existence of latching dynamics, but that fine tuning of the model parameters is needed.


Stability of learned patterns in the absence of synaptic depression
The example with three neurons (n = 3) shown earlier corresponds to the ideal situation in which there is an open domain in the parameter space of the problem for which the heteroclinic chain connecting learned patterns along edges of the hypercube [0, 1]^n indeed exists. Ideally we would like to derive a set of necessary and sufficient conditions in a general setting; however, this seems much more difficult to obtain for larger networks and longer chains. Nevertheless, as evidenced by our simulations, dynamics following edges of the hypercube and visiting learned patterns in an a priori specified sequence can be observed when the coefficients are derived using the conditions presented in the Results section. Here we derive a number of conditions that the coefficients of J^max must satisfy, which strongly restrict the choice of these quantities.
Recall that, in our setting, a learned pattern is a vertex equilibrium ξ = (ξ_1, . . . , ξ_n), where each ξ_j = 0 or 1, which is stable when no synaptic depression is present. Let us consider a sequence of learned patterns ξ^1 → · · · → ξ^p such that each pattern has exactly m excited neurons (entries equal to 1) and switching from one pattern to the next corresponds to changing the values of two entries. Possibly after re-arrangement of the indices, it is no loss of generality to assume that the excited neurons of ξ^i occupy the m consecutive positions i, . . . , i + m − 1. For any pattern ξ^i (i < p) in the above sequence, let y be the coordinate corresponding to its first non-zero entry and z the coordinate corresponding to the first 0 entry after the run of 1's. For example, in ξ^1, y = x_1 and z = x_{m+1}. We denote by ξ̂^i the vertex equilibrium which makes the connection from ξ^i to ξ^{i+1}: its coordinates are those of ξ^i except the i-th entry, which is 0 instead of 1. By assumption these intermediate states ξ̂^i are not learned patterns; they are saddles for the dynamics of (2) with all s_i fixed at 1. We specifically require the eigenvalue at ξ̂^i along y to be negative and the eigenvalue along z to be positive.

Initially the learned patterns ξ^i are stable equilibria of eqs (2) and the synaptic variables s_i are assigned their maximal value 1. As time evolves, the values s_j(t) corresponding to excited neurons (x_j(t) > 0) decrease and the eigenvalues (7) are modified. We expect that after some time t_i the eigendirection y at ξ^i becomes unstable and a heteroclinic connection is established along this edge to ξ̂^i. The synaptic variables play the role of dynamically evolving parameters. If they did not depend on time, the situation could be described as a classical bifurcation: as long as the eigenvalues at ξ^i and ξ̂^i are both negative, an unstable equilibrium exists on the edge joining the two states.
When the synaptic variables associated with the excited neurons decrease, this equilibrium moves towards ξ^i, and when it merges with it the heteroclinic connection is established (see figure S1). We also expect ξ̂^i to have an unstable eigendirection along z, so that a heteroclinic connection exists from ξ̂^i to ξ^{i+1}. These scenarios can be explicitly described by writing the equations restricted to the edges with variable coordinates y and z.
This process should repeat itself from i = 1 to i = p − 1.
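The mechanism above can be illustrated numerically. The sketch below is not the paper's eqs (2): the threshold-linear rate equations, the parameter values (τ, U, θ) and the matrix J are illustrative stand-ins. The depression equation is chosen so that the minimal synaptic value is S = (1 + τU)^{−1}, consistent with the quantity S used in the analysis.

```python
import numpy as np

# Minimal sketch of a rate network on [0, 1]^n with synaptic depression.
# NOT the paper's eqs (2); all equations and parameters are illustrative.
def simulate(J, x0, tau=8.0, U=0.5, theta=0.5, dt=0.01, T=40.0):
    x = np.array(x0, dtype=float)          # activities, confined to [0, 1]^n
    s = np.ones_like(x)                    # synaptic variables start at 1
    traj = [x.copy()]
    for _ in range(int(T / dt)):
        inp = J @ (s * x) - theta          # depressed recurrent input
        x += dt * (-x + np.clip(inp, 0.0, 1.0))
        s += dt * ((1.0 - s) - tau * U * s * x)   # depression dynamics;
        traj.append(x.copy())              # minimal value S = 1/(1 + tau*U)
    return np.array(traj), s

# Banded symmetric connectivity (bandwidth 2m - 1 with m = 2) whose
# off-diagonals increase along the chain, in the spirit of Proposition 1.
n = 5
J = np.diag(np.full(n, 2.0))
for i in range(n - 1):
    J[i, i + 1] = J[i + 1, i] = 0.6 + 0.1 * i

traj, s_final = simulate(J, x0=[1, 1, 0, 0, 0])  # start at pattern xi^1
```

Whether a given trajectory actually latches through the whole prescribed sequence depends on the fine tuning of the coefficients discussed in this section; the sketch only fixes the structure of the dynamics.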

Fig. S1: Connections from ξ^i to ξ̂^i and from ξ̂^i to ξ^{i+1}.
We make the simplifying assumption that the entries of J^max are 0 outside a band of width 2m − 1 around the diagonal. It follows from this assumption and from (7) that, when all s_j = 1, the eigenvalues at each pattern ξ^i can be written explicitly in terms of the entries of J^max. We require all these eigenvalues to be negative; this gives the two conditions (S3) and (S4), expressed in terms of quantities Λ_{i,k} built from the entries of J^max. Note that Λ_{i,k} ≠ 0 only if |k − i| < m, and equals 0 otherwise. Note also that the computation of Λ_m involves non-diagonal elements only. Hence, by making the diagonal elements large enough, it is possible to ensure that conditions (S3) and (S4) hold.
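The role of a large diagonal can be checked numerically. The exact eigenvalue formula (7) and conditions (S3)–(S4) are not reproduced here; the sketch below instead uses a threshold-linear stability criterion (a vertex pattern with all s_j = 1 is stable when every excited unit receives suprathreshold recurrent input and every silent unit receives subthreshold input), and the matrix J, the threshold θ and the patterns are illustrative assumptions:

```python
import numpy as np

# Sketch of the vertex-stability check with all s_j = 1, under an assumed
# threshold-linear criterion (not the paper's formula (7)).
def vertex_is_stable(J, xi, theta):
    h = J @ xi                     # recurrent input to each unit at the vertex
    active = xi == 1
    return bool(np.all(h[active] > theta) and np.all(h[~active] < theta))

n, theta = 5, 1.0
J = np.zeros((n, n))
np.fill_diagonal(J, 1.5)           # diagonal made large enough
for i in range(n - 1):
    J[i, i + 1] = J[i + 1, i] = 0.4 + 0.1 * i   # band of width 2m - 1, m = 2

# Consecutive-pair patterns xi^1, ..., xi^{n-1} (m = 2 excited neurons each).
patterns = [np.array([1 if i <= j <= i + 1 else 0 for j in range(n)])
            for i in range(n - 1)]
stable = [vertex_is_stable(J, xi, theta) for xi in patterns]
```

With this diagonal all four consecutive-pair patterns pass the check; shrinking the diagonal entries makes them fail, mirroring the remark that a large enough diagonal ensures stability.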
Necessary conditions on J^max for the transition from one pattern to the next
We analyze the transition at ξ̂^i between ξ^i and ξ^{i+1}. Let σ̂^i_y, σ̂^i_z be the eigenvalues along the eigendirections y and z at ξ̂^i at time t; their expressions follow from (7). We want σ̂^i_y to become negative while σ̂^i_z remains positive during an interval of time 0 < t̂^i_1 < t < t̂^i_2, which gives condition (S6). The quantity S = (1 + τU)^{−1} is the minimal value that can be attained by the synaptic variables s_j. Recall the assumption that the elements of J^max are 0 outside a band of width 2m − 1 about the diagonal. Then (S6) implies a weaker but simpler condition. When m = 2, (S6) implies the following:

Proposition 1. If m = 2, then (S6) implies that J^max_{i,i+1} < J^max_{i+2,i+1} = J^max_{i+1,i+2} (by symmetry of the matrix). Therefore the superdiagonal of J^max has coefficients with increasing values (and so does the subdiagonal).
Indeed, this condition does not hold for matrix (23), while matrix (24) satisfies it. For the case m = 2, in addition to the requirement that the off-diagonal coefficients must increase, we deduce a further set of simple conditions, (S8), involving the sums J^max_{i,i} + J^max_{i,i−1}.
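The Proposition 1 requirement amounts to an easily automated check on the superdiagonal of J^max. In the following sketch, the two 3×3 matrices are illustrative stand-ins, not the paper's matrices (23) and (24):

```python
import numpy as np

# Check the Proposition 1 requirement for m = 2: the superdiagonal entries
# of the symmetric matrix J^max must increase, J[i, i+1] < J[i+1, i+2].
def superdiagonal_increasing(J):
    d = np.diag(J, k=1)            # entries J[i, i+1]
    return bool(np.all(np.diff(d) > 0))

J_flat = np.array([[2.0, 0.8, 0.0],
                   [0.8, 2.0, 0.8],
                   [0.0, 0.8, 2.0]])   # constant superdiagonal: fails
J_incr = np.array([[2.0, 0.5, 0.0],
                   [0.5, 2.0, 0.7],
                   [0.0, 0.7, 2.0]])  # increasing superdiagonal: passes
```

By the symmetry of J^max, the same check applied to the subdiagonal gives an identical result.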
A geometric representation of (S8), similar to Fig. 1, could be constructed. It is straightforward to verify that these conditions are satisfied by the coefficients of (26).