Overcoming the Curse of Dimensionality in Neural Networks
Let $X$ be a set and $V$ a real Hilbert space. Let $H$ be a real Hilbert space of functions $f\colon X\to V$ for which there exists $C>0$ such that for all $f\in H$ and $x\in X$, $\Vert f(x)\Vert_{V}\leq C\Vert f\Vert_{H}$. For $n\in\mathbb{N}$, let $\{(x_i,y_i)\}_{i=1}^{n}\subset X\times V$ comprise our dataset. Let $q\in(0,1)$ and let $f^{*}\in H$ be the unique global minimizer of the functional
\begin{equation*}
u(f) = \frac{q}{2}\Vert f\Vert_{H}^{2} + \frac{1-q}{2n}\sum_{i=1}^{n}\Vert f(x_i)-y_i\Vert_{V}^{2}.
\end{equation*}
In this paper we show that for each $k\in\mathbb{N}$ there exists a two-layer network, where the first layer has $k$ basis functions associated with the Hilbert space $H$ and the second layer is a weighted sum of the first layer, such that the functions $f_{k}$ realized by these networks satisfy
\begin{equation*}
\Vert f_{k}-f^{*}\Vert_{H}^{2} \leq \Bigl( o(1) + \frac{C}{q^{2}}\, E\bigl[ \Vert Du_{I}(f^{*})\Vert_{H^{*}}^{2} \bigr] \Bigr)\frac{1}{k}.
\end{equation*}
Let us note that the inputs $x_i$ do not need to lie in a linear space, and that the outputs $y_i$ lie in a possibly infinite-dimensional Hilbert space $V$. The error estimate is independent of the data size $n$, and in the case where $V$ is finite dimensional, it is also independent of the dimension of $V$.
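As a concrete illustration (not the paper's construction, which the abstract does not specify): a minimal numerical sketch, assuming $X\subset\mathbb{R}$, $V=\mathbb{R}$, and $H$ a Gaussian reproducing kernel Hilbert space. Under these assumptions the representer theorem gives $f^{*}$ in closed form as kernel ridge regression with ridge parameter $qn/(1-q)$, and an $H$-orthogonal projection of $f^{*}$ onto the span of $k$ kernel sections stands in for the $k$-term two-layer network. The kernel width, synthetic data, and random subset selection are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_kernel(a, b, sigma=0.5):
    """Gaussian kernel matrix k(a_i, b_j) = exp(-|a_i - b_j|^2 / (2 sigma^2))."""
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

# Synthetic one-dimensional data (illustrative, not from the paper).
n, q = 200, 0.05
x = rng.uniform(-1, 1, n)
y = np.sin(3 * x) + 0.1 * rng.normal(size=n)

# Stationarity of u over span{k(., x_j)} gives (K + lam I) alpha = y
# with lam = q n / (1 - q), i.e. f* = sum_j alpha_j k(., x_j).
K = gauss_kernel(x, x)
lam = q * n / (1 - q)
alpha = np.linalg.solve(K + lam * np.eye(n), y)

def h_norm_sq_of_error(S, beta):
    """||f_k - f*||_H^2 where f_k = sum_{j in S} beta_j k(., x_j)."""
    v = np.zeros(n)
    v[S] = beta
    d = v - alpha
    return d @ K @ d

# A k-term surrogate f_k: H-orthogonal projection of f* onto the span of
# k randomly chosen kernel sections (solves K_SS beta = K_{S,:} alpha).
for k in (5, 10, 20, 40, 80):
    S = rng.choice(n, size=k, replace=False)
    beta = np.linalg.lstsq(K[np.ix_(S, S)], K[S] @ alpha, rcond=None)[0]
    err = h_norm_sq_of_error(S, beta)
    print(f"k = {k:3d}   ||f_k - f*||_H^2 = {err:.3e}   k * error = {k * err:.3e}")
```

The printed column $k\cdot\Vert f_{k}-f^{*}\Vert_{H}^{2}$ is only meant to visualize a $1/k$-type bound; with a smooth kernel and random centers the projection error typically decays even faster, so this sketch does not reproduce the theorem's rate, only the quantities it controls.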