Convergence of Adam for Non-convex Objectives: Relaxed Hyperparameters and Non-ergodic Case
Meixuan He, Yuqing Liang, Jinlan Liu, Dongpo Xu
arXiv:2307.11782, 20 July 2023
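The paper above analyzes convergence of the Adam optimizer under relaxed hyperparameter conditions. For context, here is a minimal sketch of the standard Adam update rule (Kingma & Ba); the hyperparameter names and defaults are the conventional ones, and the function itself is illustrative, not taken from the paper:

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update over a list of scalar parameters (textbook form).

    t is the 1-based step count, used for bias correction.
    Returns updated (theta, m, v).
    """
    new_theta, new_m, new_v = [], [], []
    for th, g, mi, vi in zip(theta, grad, m, v):
        mi = beta1 * mi + (1 - beta1) * g        # first-moment (mean) EMA
        vi = beta2 * vi + (1 - beta2) * g * g    # second-moment (uncentered variance) EMA
        m_hat = mi / (1 - beta1 ** t)            # bias correction for zero init
        v_hat = vi / (1 - beta2 ** t)
        th = th - lr * m_hat / (math.sqrt(v_hat) + eps)
        new_theta.append(th)
        new_m.append(mi)
        new_v.append(vi)
    return new_theta, new_m, new_v
```

For example, running this update on f(x) = x² (gradient 2x) from x = 1 with lr = 0.1 drives x toward the minimizer at 0. The cited convergence analyses concern exactly this iteration, under various assumptions on the objective and on (lr, beta1, beta2).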
Papers citing "Convergence of Adam for Non-convex Objectives: Relaxed Hyperparameters and Non-ergodic Case" (9 papers shown):
- Sharp higher order convergence rates for the Adam optimizer
  Steffen Dereich, Arnulf Jentzen, Adrian Riekert (ODL), 28 Apr 2025
- LDAdam: Adaptive Optimization from Low-Dimensional Gradient Statistics
  Thomas Robert, M. Safaryan, Ionut-Vlad Modoranu, Dan Alistarh (ODL), 21 Oct 2024
- An Attention-Based Algorithm for Gravity Adaptation Zone Calibration
  Chen Yu, 06 Oct 2024
- MicroAdam: Accurate Adaptive Optimization with Low Space Overhead and Provable Convergence
  Ionut-Vlad Modoranu, M. Safaryan, Grigory Malinovsky, Eldar Kurtic, Thomas Robert, Peter Richtárik, Dan Alistarh (MQ), 24 May 2024
- On the Convergence of Adam under Non-uniform Smoothness: Separability from SGDM and Beyond
  Bohan Wang, Huishuai Zhang, Qi Meng, Ruoyu Sun, Zhi-Ming Ma, Wei Chen, 22 Mar 2024
- Stochastic Gradient Descent with Dependent Data for Offline Reinforcement Learning
  Jing-rong Dong, Xin T. Tong (OffRL), 06 Feb 2022
- A Simple Convergence Proof of Adam and Adagrad
  Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier, 05 Mar 2020
- New Convergence Aspects of Stochastic Gradient Algorithms
  Lam M. Nguyen, Phuong Ha Nguyen, Peter Richtárik, K. Scheinberg, Martin Takáč, Marten van Dijk, 10 Nov 2018
- Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
  Hamed Karimi, J. Nutini, Mark W. Schmidt, 16 Aug 2016
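The last entry concerns the Polyak-Łojasiewicz (PL) condition. For context, a differentiable function f with infimum f* satisfies the PL condition with constant μ > 0 when

```latex
\frac{1}{2}\,\bigl\|\nabla f(x)\bigr\|^{2} \;\ge\; \mu\,\bigl(f(x) - f^{\ast}\bigr) \qquad \text{for all } x .
```

This inequality implies linear convergence of gradient descent without requiring convexity, which is the titular result of that paper.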