ResearchTrend.AI

arXiv:2406.04592
Convergence Analysis of Adaptive Gradient Methods under Refined Smoothness and Noise Assumptions

7 June 2024
Devyani Maladkar, Ruichen Jiang, Aryan Mokhtari

Papers citing "Convergence Analysis of Adaptive Gradient Methods under Refined Smoothness and Noise Assumptions"

2 of 2 citing papers shown:

1. A High Probability Analysis of Adaptive SGD with Momentum
   Xiaoyun Li, Francesco Orabona
   28 Jul 2020

2. A Simple Convergence Proof of Adam and Adagrad
   Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier
   05 Mar 2020