The capacity of feedforward neural networks (arXiv 1901.00434)
Pierre Baldi, Roman Vershynin
2 January 2019

Papers citing "The capacity of feedforward neural networks"

7 of 7 citing papers shown.

The Anisotropic Noise in Stochastic Gradient Descent: Its Behavior of Escaping from Sharp Minima and Regularization Effects
Zhanxing Zhu, Jingfeng Wu, Ting Yu, Lei Wu, Jin Ma · 01 Mar 2018

Nearly-tight VC-dimension and pseudodimension bounds for piecewise linear neural networks
Peter L. Bartlett, Nick Harvey, Christopher Liaw, Abbas Mehrabian · 08 Mar 2017

Neuromorphic Deep Learning Machines
Emre Neftci, C. Augustine, Somnath Paul, Georgios Detorakis · 16 Dec 2016 · BDL

Understanding deep learning requires rethinking generalization
Chiyuan Zhang, Samy Bengio, Moritz Hardt, Benjamin Recht, Oriol Vinyals · 10 Nov 2016 · HAI

The Power of Depth for Feedforward Neural Networks
Ronen Eldan, Ohad Shamir · 12 Dec 2015

A Theory of Local Learning, the Learning Channel, and the Optimality of Backpropagation
Pierre Baldi, Peter Sadowski · 22 Jun 2015

Deep Learning in Neural Networks: An Overview
Jürgen Schmidhuber · 30 Apr 2014 · HAI