Characterizing Dynamical Stability of Stochastic Gradient Descent in Overparameterized Learning

29 July 2024
Dennis Chemnitz, Maximilian Engel

Papers citing "Characterizing Dynamical Stability of Stochastic Gradient Descent in Overparameterized Learning"

2 / 2 papers shown

Understanding Gradient Descent on Edge of Stability in Deep Learning
Sanjeev Arora, Zhiyuan Li, A. Panigrahi
MLT
19 May 2022

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL
15 Sep 2016