Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$

Abstract

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$, where $d\in\mathbb{N}\cup\{\infty\}$. Here $\gamma_d$ denotes the Gaussian product probability measure on $\mathbb{R}^d$. We consider in particular ReLU and ReLU$^k$ activations for integer $k\geq 2$. For $d\in\mathbb{N}$, we show exponential convergence rates in $L^2(\mathbb{R}^d,\gamma_d)$. In the case $d=\infty$, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $\mathbb{R}^{\mathbb{N}}$, we prove dimension-independent expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)$. The rates depend only on the quantified holomorphy of (an analytic continuation of) the map $f$ to a product of strips in $\mathbb{C}^d$. As an application, we prove expression rate bounds for deep ReLU-NNs approximating response surfaces of elliptic PDEs with log-Gaussian random field inputs.
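For orientation, the objects named above have the following standard forms; these definitions (and the symbol $\tilde f$ for a network approximant) are supplied here only for illustration, and the paper's precise conventions may differ:
\[
  \gamma_d \;=\; \bigotimes_{j=1}^{d} \gamma_1,
  \qquad
  \mathrm{d}\gamma_1(x) \;=\; \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\,\mathrm{d}x,
\]
\[
  \mathrm{ReLU}(x) \;=\; \max\{0,x\},
  \qquad
  \mathrm{ReLU}^k(x) \;=\; \bigl(\max\{0,x\}\bigr)^k, \quad k\geq 2,
\]
with the approximation error of a network realization $\tilde f$ measured as
\[
  \|f-\tilde f\|_{L^2(\mathbb{R}^d,\gamma_d)}
  \;=\;
  \Bigl(\int_{\mathbb{R}^d} |f(x)-\tilde f(x)|^2 \,\mathrm{d}\gamma_d(x)\Bigr)^{1/2}.
\]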
