Overcoming the Curse of Dimensionality in Neural Networks
Let $X$ be a set and $V$ a real Hilbert space. Let $H$ be a real Hilbert space of functions $f\colon X\to V$ and assume $H$ is continuously embedded in the Banach space of bounded functions from $X$ to $V$. For $i = 1,\dots,n$, let the pairs $(x_i, y_i)\in X\times V$ comprise our dataset. Let $0 < q < 1$ and let $f^*\in H$ be the unique global minimizer of the functional
\begin{equation*}
u(f) = \frac{q}{2}\Vert f\Vert_{H}^{2} + \frac{1-q}{2n}\sum_{i=1}^{n}\Vert f(x_i)-y_i\Vert_{V}^{2}.
\end{equation*}
In this paper we show that for each $k\in\mathbb{N}$ there exists a two-layer network, whose first layer consists of $k$ functions which are Riesz representations in the Hilbert space $H$ of point evaluation functionals, and whose second layer is a weighted sum of the first layer, such that the functions $f_k$ realized by these networks satisfy
\begin{equation*}
\Vert f_{k}-f^*\Vert_{H}^{2} \leq \Bigl( o(1) + \frac{C}{q^2}\, E\bigl[ \Vert Du_{I}(f^*)\Vert_{H^{*}}^{2} \bigr] \Bigr)\frac{1}{k}.
\end{equation*}
Let us note that the $x_i$ need not lie in a linear space, and the $y_i$ lie in a possibly infinite-dimensional Hilbert space $V$. The error estimate is independent of the data size $n$, and when $V$ is finite dimensional it is also independent of the dimension of $V$. By choosing the Hilbert space $H$ appropriately, the computational complexity of evaluating the Riesz representations of point evaluations can be kept small, and thus the network has low computational complexity.
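As an illustrative special case (an assumption for intuition, not spelled out above): if $V=\mathbb{R}$ and $H$ is a reproducing kernel Hilbert space with kernel $K$, then the Riesz representer of the point evaluation $f\mapsto f(x_i)$ is the kernel section $K(x_i,\cdot)$, and the two-layer network takes the form
\begin{equation*}
f_k(x) = \sum_{j=1}^{k} w_j\, K(x_{i_j}, x), \qquad w_j \in \mathbb{R},
\end{equation*}
where the first layer evaluates the $k$ kernel sections at $x$ and the second layer forms their weighted sum. Evaluating $f_k$ then costs $k$ kernel evaluations, which is the sense in which a cheap choice of $H$ yields a network of low computational complexity.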