Learning in Willshaw-type associative networks.
A, Memory storage by Hebbian weight plasticity (Eq. 5) in a fully connected network (P = 1). Address patterns uμ are associated with content patterns vμ, where μ = 1,…,M (here M = 2). Each memory is represented by a binary activity vector of length n = 7 with k = 4 active units (which define the corresponding cell assembly). B, One-step retrieval of the first memory from a noisy query pattern ũ ≈ u¹ containing two of the four active units of u¹ (λ = 0.5). Here ũ ≈ u¹ can perfectly reactivate the corresponding memory pattern in population v (v̂ = v¹) by applying a firing threshold Θ = Σᵢ ũᵢ = 2 to the dendritic potentials
xⱼ = Σᵢ₌₁ᵐ Wᵢⱼ ũᵢ. C, As a simple form of structural plasticity, silent synapses can be pruned after learning. The resulting network has only 28 (instead of 49) synapses, corresponding to a lower anatomical connectivity P ≈ 0.57, whereas the effectual connectivity is still Peff = 1. Thus, pruning does not change network function but increases the stored information per synapse. D, Ongoing structural plasticity can similarly increase storage capacity during more realistic learning in networks with low anatomical connectivity (here P = 28/49 ≈ 0.57). During each time step t = 1, 2, 3, 4, Hebbian weight plasticity potentiates and consolidates synapses ij with a non-zero consolidation signal Sij > 0 (which equals Wij of panel A), whereas the remaining silent synapses are eliminated and replaced by new synapses at random locations. Note that the resulting network at t = 4 is identical to that in panel C.
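The storage, retrieval, and pruning steps described in panels A–C can be sketched in a few lines of Python. This is a minimal illustration using the caption's parameters (n = 7, k = 4, M = 2, Θ = 2); the two example patterns below are chosen for the sketch and are not the ones drawn in the figure, so the resulting synapse counts differ from the figure's 28.

```python
n = 7  # units per population (caption: n = 7, k = 4 active units)

# Two illustrative memories (address u^mu, content v^mu), each with k = 4
# active units; these patterns are assumptions, not the figure's patterns.
memories = [
    ({0, 1, 2, 3}, {0, 1, 2, 3}),  # (u^1, v^1)
    ({3, 4, 5, 6}, {3, 4, 5, 6}),  # (u^2, v^2)
]

# Hebbian clipped ("Willshaw") storage: W_ij = 1 iff units i and j were
# coactive in at least one memory.
W = [[0] * n for _ in range(n)]
for u, v in memories:
    for i in u:
        for j in v:
            W[i][j] = 1

def retrieve(W, query):
    """One-step retrieval: compute dendritic potentials
    x_j = sum_i W_ij * query_i and fire at threshold Theta = |query|."""
    theta = len(query)
    x = [sum(W[i][j] for i in query) for j in range(n)]
    return {j for j in range(n) if x[j] >= theta}

query = {0, 1}            # noisy query: 2 of the 4 active units of u^1
v_hat = retrieve(W, query)
print(v_hat)              # perfectly recovers v^1 = {0, 1, 2, 3}

# Panel C: prune silent synapses (W_ij = 0). Anatomical connectivity P
# drops below 1, but retrieval is unchanged (Peff = 1), because silent
# synapses never contribute to the potentials x_j.
synapses = {(i, j) for i in range(n) for j in range(n) if W[i][j] == 1}
print(len(synapses), "/", n * n)   # anatomical connectivity after pruning

# Retrieval over the pruned synapse set gives the same result.
v_pruned = {j for j in range(n)
            if sum(1 for i in query if (i, j) in synapses) >= len(query)}
print(v_pruned == v_hat)
```

With these patterns the pruned network keeps 31 of 49 synapses rather than the figure's 28; the point carried over from the caption is only that retrieval is identical before and after pruning.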