arXiv:2205.11361
Chaotic Regularization and Heavy-Tailed Limits for Deterministic Gradient Descent
23 May 2022
S. H. Lim
Yijun Wan
Umut Şimşekli
Papers citing "Chaotic Regularization and Heavy-Tailed Limits for Deterministic Gradient Descent" (7 papers):
1. Generalization Guarantees for Multi-View Representation Learning and Application to Regularization via Gaussian Product Mixture Prior
   Milad Sefidgaran, Abdellatif Zaidi, Piotr Krasnowski (25 Apr 2025)
2. Generalization Guarantees for Representation Learning via Data-Dependent Gaussian Mixture Priors
   Milad Sefidgaran, Abdellatif Zaidi, Piotr Krasnowski (21 Feb 2025)
3. Privacy of SGD under Gaussian or Heavy-Tailed Noise: Guarantees without Gradient Clipping
   Umut Şimşekli, Mert Gürbüzbalaban, Sinan Yıldırım, Lingjiong Zhu (04 Mar 2024)
4. From Stability to Chaos: Analyzing Gradient Descent Dynamics in Quadratic Regression
   Xuxing Chen, Krishnakumar Balasubramanian, Promit Ghosal, Bhavya Agrawalla (02 Oct 2023)
5. Algorithmic Stability of Heavy-Tailed SGD with General Loss Functions
   Anant Raj, Lingjiong Zhu, Mert Gürbüzbalaban, Umut Şimşekli (27 Jan 2023)
6. Stochastic Training is Not Necessary for Generalization
   Jonas Geiping, Micah Goldblum, Phillip E. Pope, Michael Moeller, Tom Goldstein (29 Sep 2021)
7. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   Nitish Shirish Keskar, Dheevatsa Mudigere, Jorge Nocedal, Mikhail Smelyanskiy, Ping Tak Peter Tang (15 Sep 2016)