Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization

17 June 2022
S. Du, Gauthier Gidel, Michael I. Jordan, C. J. Li
arXiv:2206.08573
Abstract

We consider the smooth convex-concave bilinearly-coupled saddle-point problem, $\min_{\mathbf{x}}\max_{\mathbf{y}}~F(\mathbf{x}) + H(\mathbf{x},\mathbf{y}) - G(\mathbf{y})$, where one has access to stochastic first-order oracles for $F$, $G$ as well as the bilinear coupling function $H$. Building upon standard stochastic extragradient analysis for variational inequalities, we present a stochastic *accelerated gradient-extragradient (AG-EG)* descent-ascent algorithm that combines extragradient and Nesterov's acceleration in general stochastic settings. This algorithm leverages scheduled restarting to admit a fine-grained nonasymptotic convergence rate that matches known lower bounds by both Ibrahim et al. (2020) and Zhang et al. (2021) in their corresponding settings, plus an additional statistical error term for bounded stochastic noise that is optimal up to a constant prefactor. This is the first result that achieves such a relatively mature characterization of optimality in saddle-point optimization.
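The abstract only names the AG-EG algorithm without giving its updates. As a point of reference for the extragradient building block it rests on, here is a minimal sketch of the plain (deterministic, non-accelerated) extragradient step on this problem class with bilinear coupling $H(\mathbf{x},\mathbf{y}) = \mathbf{x}^\top B \mathbf{y}$. The function names, step size, and the toy quadratic instance are illustrative assumptions, not the authors' method or code; the paper's AG-EG scheme additionally interleaves Nesterov acceleration and scheduled restarting, which are not reproduced here.

```python
import numpy as np

# Sketch of one extragradient step for
#   min_x max_y  F(x) + H(x, y) - G(y),  H(x, y) = x^T B y.
# Illustrative only: grad_F, grad_G, B, and eta are assumed names,
# not part of the paper's AG-EG algorithm.

def extragradient_step(x, y, grad_F, grad_G, B, eta):
    """One deterministic extragradient step with step size eta."""
    # Extrapolation (half) step: descent in x, ascent in y.
    x_half = x - eta * (grad_F(x) + B @ y)
    y_half = y + eta * (B.T @ x - grad_G(y))
    # Update step: re-evaluate the gradient operator at the
    # extrapolated point, then step from the original iterate.
    x_new = x - eta * (grad_F(x_half) + B @ y_half)
    y_new = y + eta * (B.T @ x_half - grad_G(y_half))
    return x_new, y_new

# Toy strongly-convex-strongly-concave instance:
#   F(x) = 0.5 * ||x||^2,  G(y) = 0.5 * ||y||^2, saddle point at (0, 0).
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
grad_F = lambda x: x
grad_G = lambda y: y
x, y = rng.standard_normal(5), rng.standard_normal(5)
for _ in range(200):
    x, y = extragradient_step(x, y, grad_F, grad_G, B, eta=0.1)
print(np.linalg.norm(x), np.linalg.norm(y))  # both should be near 0
```

The extrapolation step is what distinguishes extragradient from simultaneous gradient descent-ascent: evaluating the operator at the look-ahead point damps the rotational dynamics that the bilinear coupling induces, which is why it converges on this problem class where plain descent-ascent can diverge.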
