Zeroth-Order Algorithms for Smooth Saddle-Point Problems

arXiv: 2009.09908 · 21 September 2020

Abdurakhmon Sadiev
Aleksandr Beznosikov
Pavel Dvurechensky
Alexander Gasnikov

Community: ODL

Papers citing "Zeroth-Order Algorithms for Smooth Saddle-Point Problems" (2 of 2 shown)

1. Primal Dual Alternating Proximal Gradient Algorithms for Nonsmooth Nonconvex Minimax Problems with Coupled Linear Constraints
   Hui-Li Zhang, Junlin Wang, Zi Xu, Y. Dai
   09 Dec 2022

2. Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods
   Aleksandr Beznosikov, Eduard A. Gorbunov, Hugo Berard, Nicolas Loizou
   15 Feb 2022