Deep neural network approximation of analytic functions

Abstract
We provide an entropy bound for the spaces of neural networks with piecewise linear activation functions, such as the ReLU and the absolute value functions. This bound generalizes the known entropy bound for the space of linear functions on ℝ^d, and it depends on the value at the point (1, ..., 1) of the networks obtained by taking the absolute values of all parameters of the original networks. Keeping this value, together with the depth, width, and parameters of the networks, logarithmically dependent on 1/ε, we ε-approximate functions that are analytic on certain regions of ℝ^d.
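The quantity the bound depends on can be made concrete. The sketch below (an illustration, not taken from the paper; the network shapes and parameter values are hypothetical) takes a small fully connected ReLU network, replaces every weight and bias by its absolute value, and evaluates the resulting network at the all-ones input (1, ..., 1). Since all parameters and inputs of the absolute-value network are nonnegative, its value at (1, ..., 1) upper-bounds the absolute value of the original network there:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def forward(weights, biases, x):
    """Evaluate a fully connected ReLU network (linear output layer) at x."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    return weights[-1] @ h + biases[-1]

def abs_network_value(weights, biases):
    """Value at (1, ..., 1) of the network whose parameters are the
    absolute values of the original network's parameters."""
    d = weights[0].shape[1]          # input dimension
    abs_w = [np.abs(W) for W in weights]
    abs_b = [np.abs(b) for b in biases]
    return forward(abs_w, abs_b, np.ones(d))

# Hypothetical tiny example: a 2 -> 3 -> 1 ReLU network with random parameters.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(1)]

print(abs_network_value(weights, biases))
```

Because the absolute-value network is monotone in each coordinate on nonnegative inputs, this single scalar controls the size of the network's parameters in a way that feeds into the entropy estimate.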