Gradient Descent-Ascent Provably Converges to Strict Local Minmax Equilibria with a Finite Timescale Separation

30 September 2020
Tanner Fiez, Lillian J. Ratliff
arXiv:2009.14820

Papers citing "Gradient Descent-Ascent Provably Converges to Strict Local Minmax Equilibria with a Finite Timescale Separation"

4 / 4 papers shown
Escaping limit cycles: Global convergence for constrained nonconvex-nonconcave minimax problems
Thomas Pethick, P. Latafat, Panagiotis Patrinos, Olivier Fercoq, V. Cevher
20 Feb 2023
Closed-Loop Data Transcription to an LDR via Minimaxing Rate Reduction
Xili Dai, Shengbang Tong, Mingyang Li, Ziyang Wu, Michael Psenka, ..., Pengyuan Zhai, Yaodong Yu, Xiaojun Yuan, Harry Shum, Yi Ma
12 Nov 2021
The limits of min-max optimization algorithms: convergence to spurious non-critical sets
Ya-Ping Hsieh, P. Mertikopoulos, V. Cevher
16 Jun 2020
On Solving Minimax Optimization Locally: A Follow-the-Ridge Approach
Yuanhao Wang, Guodong Zhang, Jimmy Ba
16 Oct 2019