ResearchTrend.AI
The Representation Power of Neural Networks: Breaking the Curse of Dimensionality

10 December 2020
Moise Blanchard, M. A. Bennouna
arXiv: 2012.05451

Papers citing "The Representation Power of Neural Networks: Breaking the Curse of Dimensionality" (4 papers):

1. Why Deep Neural Networks for Function Approximation? Shiyu Liang, R. Srikant. 13 Oct 2016.
2. Error bounds for approximations with deep ReLU networks. Dmitry Yarotsky. 03 Oct 2016.
3. Why does deep and cheap learning work so well? Henry W. Lin, Max Tegmark, David Rolnick. 29 Aug 2016.
4. Learning Functions: When Is Deep Better Than Shallow. H. Mhaskar, Q. Liao, T. Poggio. 03 Mar 2016.