Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

12 February 2020
David Holzmüller
Ingo Steinwart
MLT

Papers citing "Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent"

3 / 3 papers shown
Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension
Moritz Haas
David Holzmüller
U. V. Luxburg
Ingo Steinwart
MLT
23 May 2023
Persistent Neurons
Yimeng Min
02 Jul 2020
Critical Point-Finding Methods Reveal Gradient-Flat Regions of Deep Network Losses
Charles G. Frye
James B. Simon
Neha S. Wadia
A. Ligeralde
M. DeWeese
K. Bouchard
ODL
23 Mar 2020