  3. 1611.00740

Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review

2 November 2016
T. Poggio, H. Mhaskar, Lorenzo Rosasco, Brando Miranda, Q. Liao
Abstract

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems and conjectures.
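For context, the central quantitative comparison reviewed in the paper can be sketched as follows (a summary under the paper's stated hypotheses; here ε denotes the target approximation accuracy, n the input dimension, and m the smoothness order of the functions involved). To approximate a generic m-smooth function of n variables, a shallow network needs a number of units on the order of

\[
N_{\text{shallow}} = O\!\left(\epsilon^{-n/m}\right),
\]

which is exponential in n, i.e. the curse of dimensionality. If instead the target is compositional, for example a function with a binary-tree structure whose constituent functions each depend on only two variables and are m-smooth, a deep network whose architecture mirrors that tree needs only

\[
N_{\text{deep}} = O\!\left((n-1)\,\epsilon^{-2/m}\right)
\]

units: the exponent is fixed at 2/m and no longer grows with n, which is the exponential advantage the title refers to.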
