
Overcoming the Curse of Dimensionality in Neural Networks

2 September 2018
Karen Yeressian
arXiv:1809.00368
Abstract

Let $A$ be a set and $V$ a real Hilbert space. Let $H$ be a real Hilbert space of functions $f:A\to V$ and assume $H$ is continuously embedded in the Banach space of bounded functions. For $i=1,\cdots,n$, let $(x_i,y_i)\in A\times V$ comprise our dataset. Let $0<q<1$ and $f^*\in H$ be the unique global minimizer of the functional
\begin{equation*}
u(f) = \frac{q}{2}\Vert f\Vert_{H}^{2} + \frac{1-q}{2n}\sum_{i=1}^{n}\Vert f(x_i)-y_i\Vert_{V}^{2}.
\end{equation*}
In this paper we show that for each $k\in\mathbb{N}$ there exists a two-layer network, where the first layer has $k$ functions which are Riesz representations in the Hilbert space $H$ of point-evaluation functionals and the second layer is a weighted sum of the first layer, such that the functions $f_k$ realized by these networks satisfy
\begin{equation*}
\Vert f_{k}-f^*\Vert_{H}^{2} \leq \Bigl( o(1) + \frac{C}{q^2} E\bigl[ \Vert Du_{I}(f^*)\Vert_{H^{*}}^{2} \bigr] \Bigr)\frac{1}{k}.
\end{equation*}
By choosing the Hilbert space $H$ appropriately, the computational complexity of evaluating the Riesz representations of point evaluations might be small and thus the network has low computational complexity.
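As a minimal sketch of the setting (not the paper's construction), suppose $H$ is a reproducing kernel Hilbert space with Gaussian kernel $k$, so that the Riesz representer of evaluation at a point $x$ is the kernel section $k(x,\cdot)$. Then the minimizer $f^*$ of $u$ has the form $f^* = \sum_j c_j\, k(x_j,\cdot)$ by the representer theorem, i.e. it is itself a "two-layer network" of the kind described above with $k = n$ first-layer units. The toy data, the bandwidth `sigma`, and the helper `gram` below are all illustrative choices, not from the paper:

```python
import numpy as np

# A minimal sketch, assuming H is an RKHS with Gaussian kernel, so the
# Riesz representer of point evaluation at x is the kernel section k(x, .).
# The data, sigma, and q are illustrative choices, not from the paper.

rng = np.random.default_rng(0)
n, sigma, q = 50, 0.5, 0.1

x = rng.uniform(-1.0, 1.0, size=n)                   # inputs x_i (here A = R)
y = np.sin(3.0 * x) + 0.05 * rng.standard_normal(n)  # targets y_i (here V = R)

def gram(a, b, sigma):
    """Gram matrix K[i, j] = k(a_i, b_j) for the Gaussian kernel."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * sigma ** 2))

K = gram(x, x, sigma)

# Representer theorem: with f = sum_j c_j k(x_j, .), we have
# ||f||_H^2 = c^T K c and f(x_i) = (K c)_i, so stationarity of
#   u(f) = (q/2)||f||_H^2 + ((1-q)/(2n)) sum_i ||f(x_i) - y_i||^2
# in c reduces to the linear system (q I + ((1-q)/n) K) c = ((1-q)/n) y.
lam = (1.0 - q) / n
c = np.linalg.solve(q * np.eye(n) + lam * K, lam * y)

# f* as a two-layer network: first layer = the point-evaluation
# representers k(x_j, .), second layer = the weighted sum with weights c_j.
def f_star(t):
    return gram(np.atleast_1d(t), x, sigma) @ c

print(f_star(np.array([0.0, 0.5])))
```

In this RKHS instance the exact minimizer is recovered in closed form; the theorem quoted above instead concerns networks whose first layer has only $k$ representers, with the squared $H$-distance to $f^*$ decaying at the rate $1/k$.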
