ResearchTrend.AI

How Many Samples are Needed to Estimate a Convolutional or Recurrent Neural Network?

arXiv:1805.07883 · 21 May 2018
S. Du, Yining Wang, Xiyu Zhai, Sivaraman Balakrishnan, Ruslan Salakhutdinov, Aarti Singh
SSL

Papers citing "How Many Samples are Needed to Estimate a Convolutional or Recurrent Neural Network?"

14 / 14 papers shown
Title · Authors · Tags · Date · Counters (the three per-paper counters are reproduced as shown on the page; the site does not label them)
Theoretical Analysis of Inductive Biases in Deep Convolutional Networks
Zihao Wang, Lei Wu
15 May 2023 · 23 / 19 / 0
Graph CNN for Moving Object Detection in Complex Environments from Unseen Videos
Jhony H. Giraldo, S. Javed, Naoufel Werghi, T. Bouwmans
13 Jul 2022 · 27 / 26 / 0
A New Measure of Model Redundancy for Compressed Convolutional Neural Networks
Feiqing Huang, Yuefeng Si, Yao Zheng, Guodong Li
09 Dec 2021 · 39 / 1 / 0
Visualizing the Emergence of Intermediate Visual Patterns in DNNs
Mingjie Li, Shaobo Wang, Quanshi Zhang
05 Nov 2021 · 27 / 11 / 0
A Sparse Coding Interpretation of Neural Networks and Theoretical Implications
Joshua Bowren
FAtt · 14 Aug 2021 · 32 / 1 / 0
ResMLP: Feedforward networks for image classification with data-efficient training
Hugo Touvron, Piotr Bojanowski, Mathilde Caron, Matthieu Cord, Alaaeldin El-Nouby, ..., Gautier Izacard, Armand Joulin, Gabriel Synnaeve, Jakob Verbeek, Hervé Jégou
VLM · 07 May 2021 · 30 / 656 / 0
Learning Graph Neural Networks with Approximate Gradient Descent
Qunwei Li, Shaofeng Zou, Leon Wenliang Zhong
GNN · 07 Dec 2020 · 32 / 1 / 0
Why Are Convolutional Nets More Sample-Efficient than Fully-Connected Nets?
Zhiyuan Li, Yi Zhang, Sanjeev Arora
BDL, MLT · 16 Oct 2020 · 8 / 39 / 0
A Revision of Neural Tangent Kernel-based Approaches for Neural Networks
Kyungsu Kim, A. Lozano, Eunho Yang
AAML · 02 Jul 2020 · 27 / 0 / 0
Generalization bounds for deep convolutional neural networks
Philip M. Long, Hanie Sedghi
MLT · 29 May 2019 · 42 / 89 / 0
Prognostic Value of Transfer Learning Based Features in Resectable Pancreatic Ductal Adenocarcinoma
Yucheng Zhang, Edrise M. Lobo-Mueller, P. Karanicolas, S. Gallinger, M. Haider, Farzad Khalvati
MedIm · 23 May 2019 · 15 / 10 / 0
Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks
Sanjeev Arora, S. Du, Wei Hu, Zhiyuan Li, Ruosong Wang
MLT · 24 Jan 2019 · 37 / 962 / 0
PAC-Bayesian Margin Bounds for Convolutional Neural Networks
Konstantinos Pitas, Mike Davies, P. Vandergheynst
BDL · 30 Dec 2017 · 46 / 12 / 0
The Loss Surfaces of Multilayer Networks
A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun
ODL · 30 Nov 2014 · 183 / 1,185 / 0