
Gonon, Lukas; Grigoryeva, Lyudmila and Ortega, Juan-Pablo (2023): Approximation bounds for random neural networks and reservoir systems. In: Annals of Applied Probability, Vol. 33, No. 1: pp. 28-69

Full text not available on 'Open Access LMU'.

Abstract

This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights. These methods, in which only the last layer of weights and a few hyperparameters are optimized, have been successfully applied in a wide range of static and dynamic learning problems. Despite the popularity of this approach in empirical tasks, important theoretical questions regarding the relation between the unknown function, the weight distribution, and the approximation rate have remained open. In this work it is proved that, as long as the unknown function, functional, or dynamical system is sufficiently regular, it is possible to draw the internal weights of the random (recurrent) neural network from a generic distribution (not depending on the unknown object) and quantify the error in terms of the number of neurons and the hyperparameters. In particular, this proves that echo state networks with randomly generated weights are capable of approximating a wide class of dynamical systems arbitrarily well and thus provides the first mathematical explanation for their empirically observed success at learning dynamical systems.
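
The architecture described above is the random-features / reservoir-computing paradigm: internal weights are sampled once from a fixed, target-independent distribution and frozen, and only the linear readout layer is fitted, typically by regularised least squares. Below is a minimal sketch of both variants mentioned in the abstract (a static random-weight network and an echo state network), assuming NumPy; this is not the authors' code, and all names and hyperparameters (n_neurons, spectral_radius, reg) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_feature_regression(X, y, n_neurons=200, reg=1e-6):
    """Single-hidden-layer network with randomly generated internal weights.

    Hidden weights and biases are sampled once and never trained; only the
    linear readout is solved for by ridge-regularised least squares.
    """
    d = X.shape[1]
    A = rng.standard_normal((d, n_neurons))   # random input weights (frozen)
    b = rng.standard_normal(n_neurons)        # random biases (frozen)
    H = np.tanh(X @ A + b)                    # random hidden features
    # Closed-form ridge regression for the readout weights.
    W = np.linalg.solve(H.T @ H + reg * np.eye(n_neurons), H.T @ y)
    return lambda X_new: np.tanh(X_new @ A + b) @ W

def echo_state_readout(u, y, n_res=200, spectral_radius=0.9, reg=1e-6):
    """Echo state network: random recurrent reservoir, trained readout only."""
    W_res = rng.standard_normal((n_res, n_res))
    # Rescale the reservoir matrix; a spectral radius below 1 is a common
    # heuristic sufficient condition for the echo state property.
    W_res *= spectral_radius / max(abs(np.linalg.eigvals(W_res)))
    W_in = rng.standard_normal((n_res, u.shape[1]))
    x = np.zeros(n_res)
    states = []
    for u_t in u:                             # drive the reservoir with input
        x = np.tanh(W_res @ x + W_in @ u_t)
        states.append(x)
    H = np.array(states)                      # collected reservoir states
    W_out = np.linalg.solve(H.T @ H + reg * np.eye(n_res), H.T @ y)
    return W_out, H @ W_out                   # readout weights, fitted outputs
```

In both sketches the only optimised quantities are the readout weights, which is exactly the setting of the paper: the approximation error is then controlled in terms of the number of neurons and the remaining hyperparameters, without tailoring the weight distribution to the unknown function or dynamical system.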
