ResearchTrend.AI

arXiv:2305.17275
Local Convergence of Gradient Methods for Min-Max Games: Partial Curvature Generically Suffices

26 May 2023
Guillaume Wang
Lénaïc Chizat
Abstract

We study the convergence to local Nash equilibria of gradient methods for two-player zero-sum differentiable games. It is well known that such dynamics converge locally when S ≻ 0 and may diverge when S = 0, where S ⪰ 0 is the symmetric part of the Jacobian at equilibrium, which accounts for the "potential" component of the game. We show that these dynamics also converge as soon as S is nonzero (partial curvature) and the eigenvectors of the antisymmetric part A are in general position with respect to the kernel of S. We then study the convergence rates when S ≪ A and prove that they typically depend on the average of the eigenvalues of S, instead of the minimum as an analogy with minimization problems would suggest. To illustrate our results, we consider the problem of computing mixed Nash equilibria of continuous games. We show that, thanks to partial curvature, conic particle methods -- which optimize over both the weights and the supports of the mixed strategies -- generically converge faster than fixed-support methods. For min-max games, it is thus beneficial to add degrees of freedom "with curvature": this can be interpreted as yet another benefit of over-parameterization.
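The partial-curvature phenomenon can be illustrated on a linearized model of the gradient dynamics, z_{k+1} = z_k − η(S + A)z_k, where S is the symmetric PSD part and A the antisymmetric part of the Jacobian at equilibrium. The sketch below (an illustration under assumptions, not the paper's conic particle method) picks an S that is nonzero but singular, so S ≻ 0 fails, together with a randomly drawn A: for any eigenpair (λ, v) of S + A, Re λ = v*Sv/|v|² ≥ 0, with equality only if v lies in ker S, and for generic A no eigenvector does, so all real parts are strictly positive and the iterates contract. The matrix sizes, seed, and step-size rule are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Symmetric PSD "potential" part with a nontrivial kernel:
# partial curvature (S != 0), but S is NOT positive definite.
S = np.diag([1.0, 0.5, 0.0, 0.0])

# Generic antisymmetric "rotational" part A = B - B^T.
B = rng.standard_normal((n, n))
A = B - B.T

M = S + A  # Jacobian of the dynamics at the equilibrium z* = 0

# Re(lambda) = v* S v / |v|^2 >= 0 for every eigenpair; strictly
# positive here because no eigenvector of M falls in ker(S).
lams = np.linalg.eigvals(M)
print("Re(eigenvalues):", np.sort(lams.real))

# Step size with |1 - eta*lambda| < 1 for every eigenvalue,
# so the linear iteration is a contraction in the limit.
eta = (lams.real / np.abs(lams) ** 2).min()

z = np.ones(n)
norm0 = np.linalg.norm(z)
for _ in range(50_000):
    z = z - eta * (M @ z)  # simultaneous gradient descent-ascent step
print(f"||z_0|| = {norm0:.3f}, ||z_K|| = {np.linalg.norm(z):.3e}")
```

Replacing S by the zero matrix in the same sketch makes every eigenvalue of M purely imaginary, |1 − ηλ| > 1 for any η > 0, and the iterates diverge, matching the S = 0 caveat in the abstract.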
