Tight Convergence Rate Bounds for Optimization Under Power Law Spectral Conditions
Maksim Velikanov, Dmitry Yarotsky
arXiv:2202.00992, 2 February 2022
Papers citing "Tight Convergence Rate Bounds for Optimization Under Power Law Spectral Conditions" (5 of 5 papers shown)
SGD with memory: fundamental properties and stochastic acceleration
Dmitry Yarotsky, Maksim Velikanov
05 Oct 2024
Near-Interpolators: Rapid Norm Growth and the Trade-Off between Interpolation and Generalization
Yutong Wang, Rishi Sonthalia, Wei Hu
12 Mar 2024
Approximation Results for Gradient Descent trained Neural Networks
G. Welper
09 Sep 2023
Approximation results for Gradient Descent trained Shallow Neural Networks in 1d
R. Gentile, G. Welper
17 Sep 2022
A view of mini-batch SGD via generating functions: conditions of convergence, phase transitions, benefit from negative momenta
Maksim Velikanov, Denis Kuznedelev, Dmitry Yarotsky
22 Jun 2022