On the Spectral Bias of Neural Networks

22 June 2018
Nasim Rahaman, A. Baratin, Devansh Arpit, Felix Dräxler, Min Lin, Fred Hamprecht, Yoshua Bengio, Aaron Courville

Papers citing "On the Spectral Bias of Neural Networks"

Showing 19 of 269 citing papers:

Learning Implicit Surface Light Fields
Michael Oechsle, Michael Niemeyer, L. Mescheder, Thilo Strauss, Andreas Geiger
3DH, 3DV
27 Mar 2020

Frequency Bias in Neural Networks for Input of Non-Uniform Density
Ronen Basri, Meirav Galun, Amnon Geifman, David Jacobs, Yoni Kasten, S. Kritchman
10 Mar 2020

Coherent Gradients: An Approach to Understanding Generalization in Gradient Descent-based Optimization
S. Chatterjee
ODL, OOD
25 Feb 2020

Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks
Blake Bordelon, Abdulkadir Canatar, C. Pehlevan
07 Feb 2020

Towards Understanding the Spectral Bias of Deep Learning
Yuan Cao, Zhiying Fang, Yue Wu, Ding-Xuan Zhou, Quanquan Gu
03 Dec 2019

Multi-scale Deep Neural Networks for Solving High Dimensional PDEs
Wei Cai, Zhi-Qin John Xu
AI4CE
25 Oct 2019

Neural Spectrum Alignment: Empirical Study
Dmitry Kopitkov, Vadim Indelman
19 Oct 2019

Neural networks are a priori biased towards Boolean functions with low entropy
Chris Mingard, Joar Skalse, Guillermo Valle Pérez, David Martínez-Rubio, Vladimir Mikulik, A. Louis
FAtt, AI4CE
25 Sep 2019

DeepXDE: A deep learning library for solving differential equations
Lu Lu, Xuhui Meng, Zhiping Mao, George Karniadakis
PINN, AI4CE
10 Jul 2019

Theory of the Frequency Principle for General Deep Neural Networks
Tao Luo, Zheng Ma, Zhi-Qin John Xu, Yaoyu Zhang
21 Jun 2019

Implicit Regularization in Deep Matrix Factorization
Sanjeev Arora, Nadav Cohen, Wei Hu, Yuping Luo
AI4CE
31 May 2019

A type of generalization error induced by initialization in deep neural networks
Yaoyu Zhang, Zhi-Qin John Xu, Tao Luo, Zheng Ma
19 May 2019

On the Anatomy of MCMC-Based Maximum Likelihood Learning of Energy-Based Models
Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu
29 Mar 2019

Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks
Mingchen Li, Mahdi Soltanolkotabi, Samet Oymak
NoLa
27 Mar 2019

Stiffness: A New Perspective on Generalization in Neural Networks
Stanislav Fort, Pawel Krzysztof Nowak, Stanislaw Jastrzebski, S. Narayanan
28 Jan 2019

Regularization by architecture: A deep prior approach for inverse problems
Sören Dittmer, T. Kluth, Peter Maass, Daniel Otero Baguer
10 Dec 2018

Understanding training and generalization in deep learning by Fourier analysis
Zhi-Qin John Xu
AI4CE
13 Aug 2018

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL
15 Sep 2016

Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016