Abstract
Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections, or synapses, constitute a resource shared among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representation of associative memories by keeping them sparse, uncorrelated, and non-redundant. Here, we use a model of sequence memory to illustrate how plasticity allows a recurrent network to self-optimize by gradually re-encoding the representation of its memory items. A learning rule is used to sparsify large patterns, i.e., patterns with many active units. As a result, pattern sizes become more homogeneous, which increases the network's dynamical stability during sequence recall and allows more patterns to be stored. Finally, we show that the learning rule supports online learning in that it keeps the network in a robust dynamical steady state while storing new memories and overwriting old ones.
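The sparsification idea described above can be illustrated with a minimal sketch. The code below is a hypothetical toy, not the paper's actual learning rule: patterns are binary activity vectors with heterogeneous numbers of active units, and an illustrative "sparsification" step deactivates randomly chosen units of oversized patterns until every pattern has the same target number of active units, homogenizing pattern sizes. The network size `N`, the target size, and the pattern statistics are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100            # number of units in the network (assumed)
target_size = 10   # desired number of active units per pattern (assumed)
n_patterns = 20

# Generate binary patterns with heterogeneous sizes (10 to 40 active units).
sizes = rng.integers(10, 41, size=n_patterns)
patterns = np.zeros((n_patterns, N), dtype=int)
for i, k in enumerate(sizes):
    patterns[i, rng.choice(N, size=k, replace=False)] = 1

def sparsify(pattern, k, rng):
    """Toy sparsification: deactivate random active units until k remain."""
    active = np.flatnonzero(pattern)
    out = pattern.copy()
    if active.size > k:
        drop = rng.choice(active, size=active.size - k, replace=False)
        out[drop] = 0
    return out

patterns = np.array([sparsify(p, target_size, rng) for p in patterns])

# After sparsification, all pattern sizes are homogeneous.
print(patterns.sum(axis=1))
```

In the actual model, sparsification would be driven by a synaptic learning rule acting during recall rather than by explicit random pruning; this sketch only shows the end effect of homogenized pattern sizes.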
| Document type: | Journal article |
|---|---|
| Faculty: | Biologie > Department Biologie II > Neurobiologie |
| Subject areas: | 500 Natural sciences and mathematics > 570 Life sciences; Biology |
| URN: | urn:nbn:de:bvb:19-epub-60806-6 |
| ISSN: | 1663-3563 |
| Language: | English |
| Document ID: | 60806 |
| Date of publication on Open Access LMU: | 05 Mar. 2019, 08:10 |
| Last modified: | 04 Nov. 2020, 13:39 |