The Power of Adaptivity in SGD: Self-Tuning Step Sizes with Unbounded Gradients and Affine Variance (arXiv:2202.05791)

Matthew Faw, Isidoros Tziotis, C. Caramanis, Aryan Mokhtari, Sanjay Shakkottai, Rachel A. Ward · 11 February 2022

Papers citing "The Power of Adaptivity in SGD: Self-Tuning Step Sizes with Unbounded Gradients and Affine Variance"

15 of 15 papers shown

Faster Stochastic Optimization with Arbitrary Delays via Asynchronous Mini-Batching
Amit Attia, Ofir Gaash, Tomer Koren · 14 Aug 2024

An Adaptive Stochastic Gradient Method with Non-negative Gauss-Newton Stepsizes
Antonio Orvieto, Lin Xiao · 05 Jul 2024

Convergence Guarantees for RMSProp and Adam in Generalized-smooth Non-convex Optimization with Affine Noise Variance
Qi Zhang, Yi Zhou, Shaofeng Zou · 01 Apr 2024

Remove that Square Root: A New Efficient Scale-Invariant Version of AdaGrad
Sayantan Choudhury, N. Tupitsa, Nicolas Loizou, Samuel Horváth, Martin Takáč, Eduard A. Gorbunov · 05 Mar 2024

A General Reduction for High-Probability Analysis with General Light-Tailed Distributions
Amit Attia, Tomer Koren · 05 Mar 2024

Tuning-Free Stochastic Optimization
Ahmed Khaled, Chi Jin · 12 Feb 2024

On Convergence of Adam for Stochastic Optimization under Relaxed Assumptions
Yusu Hong, Junhong Lin · 06 Feb 2024

How Free is Parameter-Free Stochastic Optimization?
Amit Attia, Tomer Koren · ODL · 05 Feb 2024

Convergence of AdaGrad for Non-convex Objectives: Simple Proofs and Relaxed Assumptions
Bo Wang, Huishuai Zhang, Zhirui Ma, Wei Chen · 29 May 2023

Two Sides of One Coin: the Limits of Untuned SGD and the Power of Adaptive Methods
Junchi Yang, Xiang Li, Ilyas Fatkhullin, Niao He · 21 May 2023

Stochastic Nonsmooth Convex Optimization with Heavy-Tailed Noises: High-Probability Bound, In-Expectation Rate and Initial Distance Adaptation
Zijian Liu, Zhengyuan Zhou · 22 Mar 2023

SGD with AdaGrad Stepsizes: Full Adaptivity with High Probability to Unknown Parameters, Unbounded Gradients and Affine Variance
Amit Attia, Tomer Koren · ODL · 17 Feb 2023

DoG is SGD's Best Friend: A Parameter-Free Dynamic Step Size Schedule
Maor Ivgi, Oliver Hinder, Y. Carmon · ODL · 08 Feb 2023

Adaptive Stochastic Variance Reduction for Non-convex Finite-Sum Minimization
Ali Kavis, Stratis Skoulakis, Kimon Antonakopoulos, L. Dadi, V. Cevher · 03 Nov 2022

A High Probability Analysis of Adaptive SGD with Momentum
Xiaoyun Li, Francesco Orabona · 28 Jul 2020