
Illium, Steffen; Schillman, Thore; Müller, Robert; Gabor, Thomas and Linnhoff-Popien, Claudia (2022): Empirical Analysis of Limits for Memory Distance in Recurrent Neural Networks. 14th International Conference on Agents and Artificial Intelligence (ICAART 2022), Online, 3-5 February 2022. In: Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 3, pp. 308-315.

Full text not available on 'Open Access LMU'.

Abstract

Common to all kinds of recurrent neural networks (RNNs) is the intention to model relations between data points through time. Even when there is no immediate relationship between subsequent data points (e.g., when the data points are generated at random), we show that RNNs trained with standard backpropagation are still able to remember a few data points back into the sequence by memorizing them outright. However, we also show that for classical RNNs, LSTMs, and GRU networks, the distance between recurrent calls across which data points can be reproduced this way is highly limited (compared to even a loose connection between data points) and subject to various constraints imposed by the type and size of the RNN in question. This implies a hard limit (far below the information-theoretic one) on the distance between related data points within which RNNs are still able to recognize said relation.
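
The task described in the abstract can be illustrated with a small sketch: an LSTM is trained to reproduce, at each step, the token it saw d steps earlier in a sequence of independently random tokens, so that pure memorization is the only viable strategy. This is not the paper's actual experiment; the vocabulary size, hidden size, distance d, and training schedule below are all illustrative assumptions.

```python
# Minimal "memory distance" sketch, assuming illustrative hyperparameters
# (not the paper's settings): the target at step t is the random token
# seen DIST steps earlier, so the network must memorize it by heart.
import torch
import torch.nn as nn

VOCAB, HIDDEN, DIST, SEQ_LEN, BATCH = 8, 32, 5, 40, 64

class Recaller(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.LSTM(VOCAB, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, x):
        h, _ = self.rnn(x)          # hidden states for every time step
        return self.out(h)          # per-step logits over the vocabulary

def make_batch():
    # Independently random tokens: no relation between subsequent data
    # points, so recall at distance DIST requires memorization.
    tokens = torch.randint(VOCAB, (BATCH, SEQ_LEN))
    x = nn.functional.one_hot(tokens, VOCAB).float()
    targets = tokens[:, :-DIST]     # token from DIST steps earlier
    return x, targets

model = Recaller()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    x, targets = make_batch()
    logits = model(x)[:, DIST:]     # align predictions with targets
    loss = loss_fn(logits.reshape(-1, VOCAB), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()                 # standard backpropagation (through time)
    opt.step()

with torch.no_grad():
    x, targets = make_batch()
    pred = model(x)[:, DIST:].argmax(-1)
    acc = (pred == targets).float().mean().item()
    print(f"recall accuracy at distance {DIST}: {acc:.3f}")
```

Sweeping DIST upward in such a setup, while varying cell type (RNN/LSTM/GRU) and hidden size, is one way to probe the kind of hard limit on memory distance the abstract refers to.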
