Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in\mathbb{N}\cup\{\infty\}$. Here $\gamma_d$ denotes the Gaussian product probability measure on $\mathbb{R}^d$. We consider in particular ReLU and ReLU${}^k$ activations for integer $k\geq 2$. For $d\in\mathbb{N}$, we show exponential convergence rates in $L^2(\mathbb{R}^d,\gamma_d)$. In case $d=\infty$, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $\mathbb{R}^{\mathbb{N}}$, we prove dimension-independent expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)$. The rates only depend on quantified holomorphy of (an analytic continuation of) the map $f$ to a product of strips in $\mathbb{C}^{\mathbb{N}}$. As an application, we prove expression rate bounds of deep ReLU-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
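Since $\gamma_d$ is the product of $d$ standard normal distributions, the $L^2(\mathbb{R}^d,\gamma_d)$ error of a network can be estimated by plain Monte Carlo with i.i.d. standard normal samples. The Python sketch below illustrates this for a small, untrained one-hidden-layer ReLU network; the target function `f`, the width `n_units`, the dimension `d`, and the sample count are hypothetical choices for illustration, not constructions from the paper.

```python
# Minimal sketch: Monte Carlo estimate of the L^2(R^d, gamma_d) error
# of a ReLU network against an analytic target. All concrete choices
# (target f, width, dimension, sample count) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
d = 4                                # input dimension (hypothetical)

def f(x):
    # Analytic target function on R^d (hypothetical example).
    return np.exp(-np.sum(x**2, axis=-1) / (2 * d)) * np.cos(x[..., 0])

def relu_net(x, W1, b1, w2, b2):
    # One-hidden-layer ReLU network: x -> w2 . relu(W1 @ x + b1) + b2.
    h = np.maximum(x @ W1.T + b1, 0.0)
    return h @ w2 + b2

# Random (untrained) weights, standing in for the networks whose
# existence and size bounds the paper establishes.
n_units = 64
W1 = rng.standard_normal((n_units, d)) / np.sqrt(d)
b1 = rng.standard_normal(n_units)
w2 = rng.standard_normal(n_units) / np.sqrt(n_units)
b2 = 0.0

# gamma_d is a product of d standard normals, so sampling from it means
# drawing i.i.d. N(0,1) coordinates; the squared L^2(gamma_d) error is
# the Gaussian expectation of the squared residual.
n_samples = 100_000
x = rng.standard_normal((n_samples, d))          # x ~ gamma_d
residual_sq = (f(x) - relu_net(x, W1, b1, w2, b2))**2
l2_gamma_error = np.sqrt(residual_sq.mean())
print(f"estimated ||f - f_NN||_(L^2(gamma_{d})) ~ {l2_gamma_error:.4f}")
```

For the $d=\infty$ case treated in the paper, such a sampler would instead draw finitely many coordinates of a point in $\mathbb{R}^{\mathbb{N}}$ after truncating the input expansion; the sketch above only covers finite $d$.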