Potential Function-based Framework for Making the Gradients Small in Convex and Min-Max Optimization

28 January 2021
Jelena Diakonikolas, Puqian Wang

Papers citing "Potential Function-based Framework for Making the Gradients Small in Convex and Min-Max Optimization"

3 / 3 papers shown
Extragradient Method: O(1/K) Last-Iterate Convergence for Monotone Variational Inequalities and Connections With Cocoercivity
Eduard A. Gorbunov, Nicolas Loizou, Gauthier Gidel
08 Oct 2021
The Complexity of Nonconvex-Strongly-Concave Minimax Optimization
Siqi Zhang, Junchi Yang, Cristóbal Guzmán, Negar Kiyavash, Niao He
29 Mar 2021
A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
Weijie Su, Stephen P. Boyd, Emmanuel J. Candes
04 Mar 2015