ResearchTrend.AI
On Convergence of Gradient Descent Ascent: A Tight Local Analysis


3 July 2022
Haochuan Li
Farzan Farnia
Subhro Das
Ali Jadbabaie
arXiv:2207.00957

Papers citing "On Convergence of Gradient Descent Ascent: A Tight Local Analysis"

5 of 5 citing papers shown:

  • "Negative Stepsizes Make Gradient-Descent-Ascent Converge" by Henry Shugart and Jason M. Altschuler (02 May 2025)
  • "Two-Timescale Gradient Descent Ascent Algorithms for Nonconvex Minimax Optimization" by Tianyi Lin, Chi Jin, and Michael I. Jordan (28 Jan 2025)
  • "TiAda: A Time-scale Adaptive Algorithm for Nonconvex Minimax Optimization" by Xiang Li, Junchi Yang, and Niao He (31 Oct 2022)
  • "Nest Your Adaptive Algorithm for Parameter-Agnostic Nonconvex Minimax Optimization" by Junchi Yang, Xiang Li, and Niao He (01 Jun 2022)
  • "Linear Convergence of the Primal-Dual Gradient Method for Convex-Concave Saddle Point Problems without Strong Convexity" by S. Du and Wei Hu (05 Feb 2018)