Abstract
We analyze the dynamics and statistical mechanics of attractor neural networks with "distributed" updating rules, in which groups of one or more neurons are updated simultaneously. Such partially parallel updating schemes are a central feature of neural-network architectures that use many processors, implemented either on special multiprocessor hardware or among many computers linked over a network. Several updating rules are classified and discussed; these rules generalize the parallel dynamics of the Little model and the one-at-a-time dynamics of the Hopfield model. Analytic results presented herein include a stability criterion that specifies sufficient conditions under which distributed dynamics lead to fixed-point attractors. For binary neurons with block-sequential updating and a Hebbian learning rule, the storage capacity is found as a function of the number of update groups. Several open problems are also discussed.
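To make the block-sequential scheme concrete, the following Python sketch (not from the paper; the function names, the zero-diagonal convention, and the tie-breaking sign choice are assumptions) stores binary patterns with a Hebbian rule and performs one sweep in which the neurons are split into groups, each group updated in parallel and the groups visited in sequence.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian coupling matrix J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu,
    with the diagonal set to zero (assumed convention)."""
    p, n = patterns.shape
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

def block_sequential_sweep(state, J, n_groups):
    """One sweep of block-sequential dynamics: neurons are partitioned into
    n_groups blocks; within a block all neurons are updated simultaneously,
    and the blocks are updated one after another."""
    s = state.copy()
    blocks = np.array_split(np.arange(len(s)), n_groups)
    for block in blocks:
        # every neuron in the block sees the same current state (parallel within block)
        local_field = J[block] @ s
        s[block] = np.where(local_field >= 0, 1, -1)
    return s

# Usage: at low loading, a stored pattern should be (close to) a fixed point.
rng = np.random.default_rng(0)
N, p = 200, 10
xi = rng.choice([-1, 1], size=(p, N))
J = hebbian_weights(xi)
s = block_sequential_sweep(xi[0], J, n_groups=4)
print(np.mean(s == xi[0]))
```

In this sketch, `n_groups=1` reduces to the fully parallel Little dynamics and `n_groups=N` to the one-at-a-time Hopfield dynamics, so the group number interpolates between the two limiting cases discussed in the abstract.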
Document type: | Journal article |
---|---|
Faculty: | Biology > Department Biology II > Neurobiology |
Subjects: | 500 Natural sciences and mathematics > 570 Life sciences; biology |
ISSN: | 1063-651X |
Language: | English |
Document ID: | 60888 |
Date deposited on Open Access LMU: | 11 Mar 2019, 14:16 |
Last modified: | 04 Nov 2020, 13:39 |