
PLoS Computational Biology Issue Image | Vol. 19(1) February 2023

Non-linear gating for continual learning

The image shows a feedforward neural network that has learned to selectively gate task-relevant information. Units in the hidden layer respond linearly to task variables when their net input is positive; for negative net inputs, they remain silent. The gating is implemented by moving the inputs to these units into and out of their active regime (left). For continual learning, this has the effect of silencing units relevant to previous tasks and protecting their weights from change, since gradients are only computed for units that are active in the current task (right). Rather than hand-crafting this gating scheme, in our publication we demonstrate how the network can learn it with a simple Hebbian update step (Flesch et al., 2023).

Image Credit: T. Flesch, D.G. Nagy
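The gradient-silencing effect described in the caption can be sketched in a few lines of NumPy. This is a minimal illustration under assumed toy values, not the authors' implementation: a single ReLU hidden layer where the gradient with respect to the weights of a silent unit (negative net input) is exactly zero, so those weights are untouched while another task is trained.

```python
import numpy as np

# Hypothetical toy weights and input, chosen so that units 0 and 2
# are active and units 1 and 3 are silent for this "task".
W = np.array([[ 1.0, 0.0, 0.0],
              [-1.0, 0.0, 0.0],
              [ 0.0, 1.0, 0.0],
              [ 0.0,-1.0, 0.0]])
x = np.array([1.0, 1.0, 0.0])

net = W @ x                    # net input to each hidden unit
h = np.maximum(net, 0.0)       # ReLU: unit is silent when net input is negative

# Backward pass through the ReLU: the gate is the indicator net > 0,
# so upstream gradients only flow through currently active units.
gate = (net > 0).astype(float)
dL_dh = np.ones_like(h)        # placeholder upstream gradient
dL_dW = (dL_dh * gate)[:, None] * x[None, :]

# Rows of dL_dW belonging to silent units are exactly zero:
# their weights are protected from change during this task.
print(dL_dW)
```

Because the gradient rows of gated-off units vanish identically (not just approximately), any weight update built from this gradient leaves those units' weights frozen, which is the mechanism the figure illustrates for avoiding interference between tasks.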


https://doi.org/10.1371/image.pcbi.v19.i01.g001