ResearchTrend.AI
arXiv:2206.06900
Grad-GradaGrad? A Non-Monotone Adaptive Stochastic Gradient Method

14 June 2022
Aaron Defazio, Baoyu Zhou, Lin Xiao

Papers citing "Grad-GradaGrad? A Non-Monotone Adaptive Stochastic Gradient Method"

5 / 5 papers shown
Adaptive proximal gradient methods are universal without approximation
Konstantinos A. Oikonomidis, Emanuel Laude, P. Latafat, Andreas Themelis, Panagiotis Patrinos
09 Feb 2024

On the convergence of adaptive first order methods: proximal gradient and alternating minimization algorithms
P. Latafat, Andreas Themelis, Panagiotis Patrinos
30 Nov 2023

Adaptive Federated Learning with Auto-Tuned Clients
J. Kim, Taha Toghani, César A. Uribe, Anastasios Kyrillidis
19 Jun 2023

Searching for Optimal Per-Coordinate Step-sizes with Multidimensional Backtracking
Frederik Kunstner, V. S. Portella, Mark W. Schmidt, Nick Harvey
05 Jun 2023

Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient
P. Latafat, Andreas Themelis, L. Stella, Panagiotis Patrinos
11 Jan 2023