Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods
Ben Adcock, Matthew J. Colbrook, Maksym Neyra-Nesterenko
5 January 2023
arXiv:2301.02268
Papers citing "Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods" (5 papers):
Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks
Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga (04 Apr 2024)

Implicit regularization in AI meets generalized hardness of approximation in optimization -- Sharp results for diagonal linear networks
J. S. Wind, Vegard Antun, A. Hansen (13 Jul 2023)

Acceleration Methods
Alexandre d’Aspremont, Damien Scieur, Adrien B. Taylor (23 Jan 2021)

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt (16 Aug 2016)

A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
Weijie Su, Stephen P. Boyd, Emmanuel J. Candes (04 Mar 2015)