Towards a regularity theory for ReLU networks – chain rule and global error estimates

Author(s)
Julius Berner, Dennis Elbrächter, Philipp Grohs, Arnulf Jentzen
Abstract

Although the classical derivative exists almost everywhere for neural networks with locally Lipschitz continuous activation functions, the standard chain rule is in general not applicable. We consider a way of introducing a derivative for neural networks that admits a chain rule and is both rigorous and easy to work with. In addition, we present a method for converting approximation results on bounded domains into global (pointwise) estimates. This can be used to extend known neural network approximation theory to include the study of regularity properties. Of particular interest is the application to neural networks with the ReLU activation function, where it contributes to the understanding of the success of deep learning methods for high-dimensional partial differential equations.
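
A minimal sketch of the chain-rule obstruction (our own illustration, not an example drawn from the paper), writing \rho(x) = \max(x, 0) for the ReLU activation: realize the zero function by a one-layer ReLU network, g(x) = \rho(x) - \rho(x), and compose it with f = \rho. Then

\[
  (f \circ g)(x) = \rho(0) = 0 \quad \text{for all } x \in \mathbb{R},
  \qquad \text{hence} \qquad (f \circ g)'(x) = 0 \text{ everywhere}.
\]

Both f and g are differentiable almost everywhere (g even everywhere), yet the classical chain rule requires f to be differentiable at g(x) = 0, which fails for every x, so the expression f'(g(x))\, g'(x) is undefined on a set of full measure even though the composition is smooth. This is the sense in which the standard chain rule is "not applicable", and it is the gap the paper's network-adapted derivative notion is meant to close.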

Organisation(s)
Department of Mathematics, Research Network Data Science
External organisation(s)
Eidgenössische Technische Hochschule Zürich
Pages
1-5
DOI
https://doi.org/10.1109/SampTA45681.2019.9031005
Publication date
2019
Peer reviewed
Yes
Austrian Fields of Science 2012
101031 Approximation theory, 102018 Artificial neural networks
ASJC Scopus subject areas
Analysis, Signal Processing, Applied Mathematics, Statistics and Probability, Statistics, Probability and Uncertainty
Portal URL
https://ucris.univie.ac.at/portal/en/publications/towards-a-regularity-theory-for-relu-networks--chain-rule-and-global-error-estimates(41d899d4-2590-4e14-9c98-c730bc8b09be).html