A Theoretical-Empirical Approach to Estimating Sample Complexity of DNNs

5 May 2021 · arXiv:2105.01867
Devansh Bisla, Apoorva Nandini Saridena, A. Choromańska

Papers citing "A Theoretical-Empirical Approach to Estimating Sample Complexity of DNNs"

19 / 19 papers shown

Fantastic Generalization Measures and Where to Find Them
Yiding Jiang, Behnam Neyshabur, H. Mobahi, Dilip Krishnan, Samy Bengio
AI4CE · 136 · 607 · 0 · 04 Dec 2019

Learning Curves for Deep Neural Networks: A Gaussian Field Theory Perspective
Omry Cohen, Orit Malka, Zohar Ringel
AI4CE · 52 · 22 · 0 · 12 Jun 2019

Dimensionality compression and expansion in Deep Neural Networks
Stefano Recanatesi, M. Farrell, Madhu S. Advani, Timothy Moore, Guillaume Lajoie, E. Shea-Brown
60 · 73 · 0 · 02 Jun 2019

Asymptotic learning curves of kernel methods: empirical data v.s. Teacher-Student paradigm
S. Spigler, Mario Geiger, Matthieu Wyart
68 · 38 · 0 · 26 May 2019

Estimating the intrinsic dimension of datasets by a minimal neighborhood information
Elena Facco, M. d’Errico, Alex Rodriguez, Alessandro Laio
49 · 327 · 0 · 19 Mar 2018

Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle
Rana Ali Amjad, Bernhard C. Geiger
71 · 196 · 0 · 27 Feb 2018

Stronger generalization bounds for deep nets via a compression approach
Sanjeev Arora, Rong Ge, Behnam Neyshabur, Yi Zhang
MLT, AI4CE · 86 · 642 · 0 · 14 Feb 2018

State-of-the-art Speech Recognition With Sequence-to-Sequence Models
Chung-Cheng Chiu, Tara N. Sainath, Yonghui Wu, Rohit Prabhavalkar, Patrick Nguyen, ..., Katya Gonina, Navdeep Jaitly, Yue Liu, J. Chorowski, M. Bacchiani
AI4TS · 89 · 1,153 · 0 · 05 Dec 2017

To prune, or not to prune: exploring the efficacy of pruning for model compression
Michael Zhu, Suyog Gupta
194 · 1,276 · 0 · 05 Oct 2017

Revisiting Unreasonable Effectiveness of Data in Deep Learning Era
Chen Sun, Abhinav Shrivastava, Saurabh Singh, Abhinav Gupta
VLM · 188 · 2,401 · 0 · 10 Jul 2017

Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data
Gintare Karolina Dziugaite, Daniel M. Roy
106 · 815 · 0 · 31 Mar 2017

Nearly-tight VC-dimension and pseudodimension bounds for piecewise linear neural networks
Peter L. Bartlett, Nick Harvey, Christopher Liaw, Abbas Mehrabian
208 · 432 · 0 · 08 Mar 2017

Understanding deep learning requires rethinking generalization
Chiyuan Zhang, Samy Bengio, Moritz Hardt, Benjamin Recht, Oriol Vinyals
HAI · 339 · 4,629 · 0 · 10 Nov 2016

Joint Unsupervised Learning of Deep Representations and Image Clusters
Jianwei Yang, Devi Parikh, Dhruv Batra
SSL · 54 · 817 · 0 · 13 Apr 2016

Learning both Weights and Connections for Efficient Neural Networks
Song Han, Jeff Pool, J. Tran, W. Dally
CVBM · 313 · 6,681 · 0 · 08 Jun 2015

In Search of the Real Inductive Bias: On the Role of Implicit Regularization in Deep Learning
Behnam Neyshabur, Ryota Tomioka, Nathan Srebro
AI4CE · 94 · 658 · 0 · 20 Dec 2014

Neural Machine Translation by Jointly Learning to Align and Translate
Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio
AIMat · 558 · 27,311 · 0 · 01 Sep 2014

Exploiting Linear Structure Within Convolutional Networks for Efficient Evaluation
Emily L. Denton, Wojciech Zaremba, Joan Bruna, Yann LeCun, Rob Fergus
FAtt · 177 · 1,689 · 0 · 02 Apr 2014

Speech Recognition with Deep Recurrent Neural Networks
Alex Graves, Abdel-rahman Mohamed, Geoffrey E. Hinton
226 · 8,517 · 0 · 22 Mar 2013