Optimization with Momentum: Dynamical, Control-Theoretic, and Symplectic Perspectives
Michael Muehlebach, Michael I. Jordan
arXiv:2002.12493 · 28 February 2020
Papers citing "Optimization with Momentum: Dynamical, Control-Theoretic, and Symplectic Perspectives" (12 / 12 papers shown)
Accelerated First-Order Optimization under Nonlinear Constraints
Michael Muehlebach, Michael I. Jordan · 01 Feb 2023

A Dynamical Systems Perspective on Nesterov Acceleration
Michael Muehlebach, Michael I. Jordan · 17 May 2019

Conformal Symplectic and Relativistic Optimization
G. França, Jeremias Sulam, Daniel P. Robinson, René Vidal · 11 Mar 2019

On Nonconvex Optimization for Machine Learning: Gradients, Stochasticity, and Saddle Points
Chi Jin, Praneeth Netrapalli, Rong Ge, Sham Kakade, Michael I. Jordan · 13 Feb 2019

On Symplectic Optimization
M. Betancourt, Michael I. Jordan, Ashia Wilson · 10 Feb 2018

Accelerated Gradient Descent Escapes Saddle Points Faster than Gradient Descent
Chi Jin, Praneeth Netrapalli, Michael I. Jordan · 28 Nov 2017

Gradient Descent Can Take Exponential Time to Escape Saddle Points
S. Du, Chi Jin, Jason D. Lee, Michael I. Jordan, Barnabás Póczós, Aarti Singh · 29 May 2017

Stochastic Heavy Ball
S. Gadat, Fabien Panloup, Sofiane Saadane · 14 Sep 2016

A Variational Perspective on Accelerated Methods in Optimization
Andre Wibisono, Ashia Wilson, Michael I. Jordan · 14 Mar 2016

A geometric alternative to Nesterov's accelerated gradient descent
Sébastien Bubeck, Y. Lee, Mohit Singh · 26 Jun 2015

Escaping From Saddle Points --- Online Stochastic Gradient for Tensor Decomposition
Rong Ge, Furong Huang, Chi Jin, Yang Yuan · 06 Mar 2015

A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
Weijie Su, Stephen P. Boyd, Emmanuel J. Candès · 04 Mar 2015