Online Adaptive Methods, Universality and Acceleration
Kfir Y. Levy, A. Yurtsever, V. Cevher
8 September 2018 · arXiv:1809.02864 · ODL
Papers citing "Online Adaptive Methods, Universality and Acceleration" (22 of 22 papers shown)
| Title | Authors | Tag | Metrics | Date |
| --- | --- | --- | --- | --- |
| A simple uniformly optimal method without line search for convex optimization | Tianjiao Li, Guanghui Lan |  | 26 / 20 / 0 | 16 Oct 2023 |
| Distributed Extra-gradient with Optimal Complexity and Communication Guarantees | Ali Ramezani-Kebrya, Kimon Antonakopoulos, Igor Krawczuk, Justin Deschenaux, V. Cevher |  | 36 / 2 / 0 | 17 Aug 2023 |
| Differentially Private Adaptive Optimization with Delayed Preconditioners | Tian Li, Manzil Zaheer, Ziyu Liu, Sashank J. Reddi, H. B. McMahan, Virginia Smith |  | 42 / 10 / 0 | 01 Dec 2022 |
| Extra-Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods | Kimon Antonakopoulos, Ali Kavis, V. Cevher | ODL | 23 / 12 / 0 | 03 Nov 2022 |
| TiAda: A Time-scale Adaptive Algorithm for Nonconvex Minimax Optimization | Xiang Li, Junchi Yang, Niao He |  | 26 / 8 / 0 | 31 Oct 2022 |
| Optimistic Optimisation of Composite Objective with Exponentiated Update | Weijia Shao, F. Sivrikaya, S. Albayrak |  | 23 / 3 / 0 | 08 Aug 2022 |
| Improved Policy Optimization for Online Imitation Learning | J. Lavington, Sharan Vaswani, Mark W. Schmidt | OffRL | 18 / 6 / 0 | 29 Jul 2022 |
| Nest Your Adaptive Algorithm for Parameter-Agnostic Nonconvex Minimax Optimization | Junchi Yang, Xiang Li, Niao He | ODL | 27 / 22 / 0 | 01 Jun 2022 |
| Exploiting the Curvature of Feasible Sets for Faster Projection-Free Online Learning | Zakaria Mhammedi |  | 10 / 8 / 0 | 23 May 2022 |
| High Probability Bounds for a Class of Nonconvex Algorithms with AdaGrad Stepsize | Ali Kavis, Kfir Y. Levy, V. Cevher |  | 19 / 38 / 0 | 06 Apr 2022 |
| Towards Noise-adaptive, Problem-adaptive (Accelerated) Stochastic Gradient Descent | Sharan Vaswani, Benjamin Dubois-Taine, Reza Babanezhad |  | 51 / 11 / 0 | 21 Oct 2021 |
| Adaptive Differentially Private Empirical Risk Minimization | Xiaoxia Wu, Lingxiao Wang, Irina Cristali, Quanquan Gu, Rebecca Willett |  | 38 / 6 / 0 | 14 Oct 2021 |
| SVRG Meets AdaGrad: Painless Variance Reduction | Benjamin Dubois-Taine, Sharan Vaswani, Reza Babanezhad, Mark W. Schmidt, Simon Lacoste-Julien |  | 18 / 18 / 0 | 18 Feb 2021 |
| First-Order Methods for Convex Optimization | Pavel Dvurechensky, Mathias Staudigl, Shimrit Shtern | ODL | 28 / 25 / 0 | 04 Jan 2021 |
| Sequential convergence of AdaGrad algorithm for smooth convex optimization | Cheik Traoré, Edouard Pauwels |  | 14 / 21 / 0 | 24 Nov 2020 |
| Adaptive extra-gradient methods for min-max optimization and games | Kimon Antonakopoulos, E. V. Belmega, P. Mertikopoulos |  | 54 / 46 / 0 | 22 Oct 2020 |
| Adaptive Gradient Methods for Constrained Convex Optimization and Variational Inequalities | Alina Ene, Huy Le Nguyen, Adrian Vladu | ODL | 30 / 28 / 0 | 17 Jul 2020 |
| Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization | Samuel Horváth, Lihua Lei, Peter Richtárik, Michael I. Jordan |  | 57 / 30 / 0 | 13 Feb 2020 |
| UniXGrad: A Universal, Adaptive Algorithm with Optimal Guarantees for Constrained Optimization | Ali Kavis, Kfir Y. Levy, Francis R. Bach, V. Cevher | ODL | 6 / 56 / 0 | 30 Oct 2019 |
| Anytime Online-to-Batch Conversions, Optimism, and Acceleration | Ashok Cutkosky |  | 13 / 7 / 0 | 03 Mar 2019 |
| On the Convergence of Stochastic Gradient Descent with Adaptive Stepsizes | Xiaoyun Li, Francesco Orabona |  | 37 / 290 / 0 | 21 May 2018 |
| A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights | Weijie Su, Stephen P. Boyd, Emmanuel J. Candes |  | 108 / 1,154 / 0 | 04 Mar 2015 |