Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent
David Holzmüller, Ingo Steinwart
arXiv:2002.04861 (MLT), 12 February 2020
Papers citing "Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent" (3 papers):

1. "Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension" by Moritz Haas, David Holzmüller, U. V. Luxburg, Ingo Steinwart (MLT), 23 May 2023
2. "Persistent Neurons" by Yimeng Min, 02 Jul 2020
3. "Critical Point-Finding Methods Reveal Gradient-Flat Regions of Deep Network Losses" by Charles G. Frye, James B. Simon, Neha S. Wadia, A. Ligeralde, M. DeWeese, K. Bouchard (ODL), 23 Mar 2020