Learning spatiotemporal signals using a recurrent spiking network that discretizes time

Fig 3

Learning a non-Markovian sequence via the read-out neurons.

(A) Excitatory neurons in the recurrent network are all-to-all connected to the read-out neurons. The read-out neurons receive additional excitatory input from the supervisor neurons and inhibitory input from interneurons. The supervisor neurons receive spike trains drawn from a Poisson process with a rate determined by the target sequence. The read-out synapses are plastic under the voltage-based STDP rule. (B) The rate of the input signal to the supervisor neurons A, B and C. The supervisor sequence is ABCBA, where each letter represents a 75 ms external stimulation of the respective supervisor neuron at 10 kHz. (C) After learning, the supervisor input and plasticity are turned off, and the read-out neurons are driven solely by the recurrent network. (D) The read-out weight matrix WRE after 12 seconds of learning. (E) Spikes of the recurrent network (top) and read-out (bottom) neurons under spontaneous activity. Excitatory neurons in the recurrent network reliably drive sequence replays. (F) The target rate (top) and the rate of the read-out neurons (bottom), computed from a single sequence replay and normalized to [0, 1]. The spikes of the read-out neurons are convolved with a Gaussian kernel with a width of ∼12 ms.
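The supervisor input and the rate estimate in panel F can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's code: the time step, array names, and kernel truncation are assumptions; only the sequence ABCBA, the 75 ms segment length, the 10 kHz stimulation rate, the ∼12 ms Gaussian width, and the [0, 1] normalization come from the caption.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-5                       # 0.01 ms time step (assumed)
seq = "ABCBA"                   # supervisor sequence from the caption
step = int(0.075 / dt)          # 75 ms stimulation per letter
rate_hz = 10_000.0              # 10 kHz external stimulation

# Piecewise-constant Poisson rate for each supervisor neuron A, B, C
neurons = "ABC"
T = step * len(seq)
rates = np.zeros((len(neurons), T))
for i, letter in enumerate(seq):
    rates[neurons.index(letter), i * step:(i + 1) * step] = rate_hz

# Draw Poisson spike trains: one Bernoulli draw per bin, p = rate * dt
spikes = rng.random(rates.shape) < rates * dt

# Estimate a smooth rate: convolve spikes with a Gaussian kernel (~12 ms)
sigma_bins = int(0.012 / dt)
t = np.arange(-3 * sigma_bins, 3 * sigma_bins + 1)
kernel = np.exp(-0.5 * (t / sigma_bins) ** 2)
kernel /= kernel.sum()
rate_est = np.array([np.convolve(s, kernel, mode="same") for s in spikes])

# Normalize each trace to [0, 1], as in panel F
rate_est /= rate_est.max(axis=1, keepdims=True)
```

In the figure the same smoothing is applied to the read-out spikes after learning; here it is shown on the supervisor trains only, since those are fully specified by the caption.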