ResearchTrend.AI
Accelerated Stochastic Min-Max Optimization Based on Bias-corrected Momentum

18 June 2024
Haoyuan Cai
Sulaiman A. Alghunaim
Ali H. Sayed
Abstract

Lower-bound analyses for nonconvex strongly-concave minimax optimization problems have shown that stochastic first-order algorithms require at least $\mathcal{O}(\varepsilon^{-4})$ oracle complexity to find an $\varepsilon$-stationary point. Some works indicate that this complexity can be improved to $\mathcal{O}(\varepsilon^{-3})$ when the loss gradient is Lipschitz continuous. The question of whether enhanced convergence rates can be achieved under distinct conditions remains unresolved. In this work, we address this question for optimization problems that are nonconvex in the minimization variable and strongly concave or Polyak-Lojasiewicz (PL) in the maximization variable. We introduce novel bias-corrected momentum algorithms that utilize efficient Hessian-vector products. We establish convergence conditions and demonstrate a lower iteration complexity of $\mathcal{O}(\varepsilon^{-3})$ for the proposed algorithms. The effectiveness of the method is validated through applications to robust logistic regression using real-world datasets.
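To illustrate the flavor of bias-corrected momentum in stochastic min-max optimization, here is a minimal sketch of a STORM-style gradient descent-ascent loop on a toy quadratic saddle problem. The objective, step size `eta`, and momentum parameter `a` are illustrative assumptions; this is not the paper's algorithm, which additionally exploits Hessian-vector products and covers nonconvex-strongly-concave and PL settings.

```python
import numpy as np

# Toy saddle problem (illustrative assumption, not from the paper):
#   f(x, y) = 0.5*x**2 + x*y - 0.5*y**2
# f is strongly concave in y; the unique saddle point is (0, 0).
# Gradients are observed with additive Gaussian noise.

def stochastic_grad(x, y, noise):
    gx = x + y + noise[0]   # df/dx (descent direction for x)
    gy = x - y + noise[1]   # df/dy (ascent direction for y)
    return gx, gy

def bias_corrected_gda(steps=3000, eta=0.05, a=0.1, seed=0):
    """STORM-style bias-corrected momentum for gradient descent-ascent."""
    rng = np.random.default_rng(seed)
    x, y = 1.0, 1.0
    noise = rng.normal(scale=0.1, size=2)
    dx, dy = stochastic_grad(x, y, noise)   # initialize momentum estimates
    for _ in range(steps):
        x_new, y_new = x - eta * dx, y + eta * dy
        # Bias correction: evaluate the SAME noisy sample at both iterates,
        # giving d_t = g(w_t; xi_t) + (1 - a) * (d_{t-1} - g(w_{t-1}; xi_t)),
        # so the noise cancels inside the correction term.
        noise = rng.normal(scale=0.1, size=2)
        gx_new, gy_new = stochastic_grad(x_new, y_new, noise)
        gx_old, gy_old = stochastic_grad(x, y, noise)
        dx = gx_new + (1 - a) * (dx - gx_old)
        dy = gy_new + (1 - a) * (dy - gy_old)
        x, y = x_new, y_new
    return x, y

x_star, y_star = bias_corrected_gda()
# Iterates settle near the saddle point (0, 0).
```

Because the same sample is evaluated at consecutive iterates, the stochastic noise cancels in the correction term, which is the variance-reduction effect that permits the improved complexity discussed in the abstract.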

@article{cai2025_2406.13041,
  title={Accelerated Stochastic Min-Max Optimization Based on Bias-corrected Momentum},
  author={Haoyuan Cai and Sulaiman A. Alghunaim and Ali H. Sayed},
  journal={arXiv preprint arXiv:2406.13041},
  year={2025}
}