Potential-Function Proofs for First-Order Methods

13 December 2017
N. Bansal, Anupam Gupta

Papers citing "Potential-Function Proofs for First-Order Methods"

6 / 6 papers shown
• Stacking as Accelerated Gradient Descent. Naman Agarwal, Pranjal Awasthi, Satyen Kale, Eric Zhao. ODL. 110 / 2 / 0. 20 Feb 2025
• Introduction to Online Convex Optimization. Elad Hazan. OffRL. 148 / 1,927 / 0. 07 Sep 2019
• Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions. Adrien B. Taylor, Francis R. Bach. 35 / 63 / 0. 03 Feb 2019
• A Variational Perspective on Accelerated Methods in Optimization. Andre Wibisono, Ashia Wilson, Michael I. Jordan. 85 / 573 / 0. 14 Mar 2016
• A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights. Weijie Su, Stephen P. Boyd, Emmanuel J. Candes. 157 / 1,166 / 0. 04 Mar 2015
• Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent. Zeyuan Allen-Zhu, L. Orecchia. 71 / 16 / 0. 06 Jul 2014