Leibold, Christian; Bendels, Michael H. K. (2009): Learning to Discriminate Through Long-Term Changes of Dynamical Synaptic Transmission. In: Neural Computation, Vol. 21, No. 12: pp. 3408-3428

Abstract

Short-term synaptic plasticity is modulated by long-term synaptic changes. There is, however, no general agreement on the computational role of this interaction. Here, we derive a learning rule for the release probability and the maximal synaptic conductance in a circuit model with combined recurrent and feedforward connections that allows learning to discriminate among natural inputs. Short-term synaptic plasticity thereby provides a nonlinear expansion of the input space of a linear classifier, whereas the random recurrent network serves to decorrelate the expanded input space. Computer simulations reveal that the twofold increase in the number of input dimensions through short-term synaptic plasticity improves the performance of a standard perceptron by up to 100%. The distributions of release probabilities and maximal synaptic conductances at the capacity limit strongly depend on the balance between excitation and inhibition. The model also suggests a new computational interpretation of spikes evoked by stimuli outside the classical receptive field. These neuronal activities may reflect decorrelation of the expanded stimulus space by intracortical synaptic connections.
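To make the core idea concrete, the sketch below illustrates how a short-term-depression nonlinearity can double the input dimensions seen by a perceptron. It is a minimal toy example, not the authors' model: the steady-state depression formula (Tsodyks-Markram style), all parameter values, the random input statistics, and the labels are illustrative assumptions, and the recurrent decorrelation stage and the learned release probabilities from the paper are omitted.

```python
# Minimal sketch (not the paper's model): short-term depression as a
# nonlinear expansion of perceptron inputs. All parameters and the
# steady-state formula are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def depression_transform(rates, U=0.5, tau_rec=0.2):
    """Steady-state synaptic drive under depression (Tsodyks-Markram style):
    effective drive ~ U * rate / (1 + U * tau_rec * rate)."""
    return U * rates / (1.0 + U * tau_rec * rates)

# Random surrogate "natural input" firing-rate patterns with binary labels.
n_patterns, n_inputs = 200, 50
X = rng.exponential(scale=10.0, size=(n_patterns, n_inputs))  # rates in Hz
y = rng.choice([-1, 1], size=n_patterns)

# Expanded input: original rates plus their depressed (nonlinear) copy,
# doubling the number of input dimensions available to the classifier.
X_exp = np.hstack([X, depression_transform(X)])

def train_perceptron(X, y, epochs=100, lr=0.1):
    """Standard perceptron learning rule on (pattern, label) pairs."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified: update weights
                w += lr * yi * xi
                b += lr * yi
    return w, b

def accuracy(X, y, w, b):
    return np.mean(np.sign(X @ w + b) == y)

w1, b1 = train_perceptron(X, y)
w2, b2 = train_perceptron(X_exp, y)
print("plain inputs:   ", accuracy(X, y, w1, b1))
print("expanded inputs:", accuracy(X_exp, y, w2, b2))
```

With random labels the comparison only probes storage capacity, but it shows the mechanism the abstract describes: the depressed copy of each input is a nonlinear function of the original rate, so a linear classifier operating on the expanded space can separate pattern sets that are not linearly separable in the original space.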