ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review

2 November 2016
T. Poggio, H. Mhaskar, Lorenzo Rosasco, Brando Miranda, Q. Liao

Papers citing "Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review"

3 / 103 papers shown
  • Equivalence of restricted Boltzmann machines and tensor network states
    Martín Arjovsky, Song Cheng, Haidong Xie, Léon Bottou, Tao Xiang
    17 Jan 2017
  • Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks
    Itay Safran, Ohad Shamir
    31 Oct 2016
  • Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex
    Q. Liao, T. Poggio
    13 Apr 2016