Tight Convergence Rate Bounds for Optimization Under Power Law Spectral Conditions

2 February 2022
Maksim Velikanov
Dmitry Yarotsky

Papers citing "Tight Convergence Rate Bounds for Optimization Under Power Law Spectral Conditions"

5 papers shown
SGD with memory: fundamental properties and stochastic acceleration
Dmitry Yarotsky, Maksim Velikanov
05 Oct 2024
Near-Interpolators: Rapid Norm Growth and the Trade-Off between Interpolation and Generalization
Yutong Wang, Rishi Sonthalia, Wei Hu
12 Mar 2024
Approximation Results for Gradient Descent trained Neural Networks
G. Welper
09 Sep 2023
Approximation results for Gradient Descent trained Shallow Neural Networks in $1d$
R. Gentile, G. Welper
17 Sep 2022
A view of mini-batch SGD via generating functions: conditions of convergence, phase transitions, benefit from negative momenta
Maksim Velikanov, Denis Kuznedelev, Dmitry Yarotsky
22 Jun 2022