ORCID: https://orcid.org/0000-0001-9738-2487 and Petersen, Philipp
(2020):
Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms.
In: Analysis and Applications, Vol. 18, No. 5, pp. 803-859
Abstract
We analyze to what extent deep Rectified Linear Unit (ReLU) neural networks can efficiently approximate Sobolev regular functions when the approximation error is measured with respect to weaker Sobolev norms. In this context, we first establish upper approximation bounds by ReLU neural networks for Sobolev regular functions by explicitly constructing the approximating ReLU neural networks. Then, we establish lower approximation bounds for the same type of function classes. A trade-off between the regularity used in the approximation norm and the complexity of the neural network can be observed in both the upper and the lower bounds. Our results extend recent advances in the approximation theory of ReLU networks to the regime that is most relevant for applications in the numerical analysis of partial differential equations.
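The explicit constructions mentioned in the abstract are in the spirit of classical ReLU approximation results. As a loosely related illustration (not the construction from this paper), the sketch below uses Yarotsky's well-known sawtooth composition to approximate x^2 on [0,1] with a deep ReLU network; the function names and the grid-based error check are illustrative choices, not taken from the source.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Triangular "tooth" on [0, 1], realized exactly by three ReLU units:
    # hat(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def approx_square(x, m):
    # Yarotsky-style depth-m ReLU approximation of x**2 on [0, 1]:
    # f_m(x) = x - sum_{s=1}^{m} g_s(x) / 4**s, where g_s = hat composed s times.
    # The known sup-norm error bound is 4**(-(m + 1)).
    g = np.asarray(x, dtype=float).copy()
    out = np.asarray(x, dtype=float).copy()
    for s in range(1, m + 1):
        g = hat(g)          # one more composition = one more "layer"
        out = out - g / 4**s
    return out

# Measure the sup-norm error on a fine grid for increasing depth.
x = np.linspace(0.0, 1.0, 10001)
for m in [2, 4, 6]:
    err = np.max(np.abs(approx_square(x, m) - x**2))
    print(f"depth m={m}: sup error ~ {err:.2e} (bound {4.0**-(m+1):.2e})")
```

The printed errors shrink geometrically with the depth m, matching the 4^{-(m+1)} bound; this exponential error decay with network depth is the qualitative phenomenon that approximation results of the kind studied in the paper make precise, there for general Sobolev functions and weaker Sobolev norms.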
Document type: | Journal article |
---|---|
Faculty: | Mathematics, Computer Science and Statistics > Mathematics > Professorship for Mathematical Foundations of the Understanding of Artificial Intelligence |
Subject areas: | 500 Natural sciences and mathematics > 510 Mathematics |
ISSN: | 0219-5305 |
Language: | English |
Document ID: | 126400 |
Date published on Open Access LMU: | 27 May 2025, 10:23 |
Last modified: | 27 May 2025, 10:23 |