Sampling without Replacement Leads to Faster Rates in Finite-Sum Minimax Optimization

7 June 2022
Aniket Das, Bernhard Schölkopf, Michael Muehlebach

Papers citing "Sampling without Replacement Leads to Faster Rates in Finite-Sum Minimax Optimization"

4 papers shown

1. Central Limit Theorem for Two-Timescale Stochastic Approximation with Markovian Noise: Theory and Applications
   Jie Hu, Vishwaraj Doshi, Do Young Eun (17 Jan 2024)
2. Provably Fast Finite Particle Variants of SVGD via Virtual Particle Stochastic Approximation
   Aniket Das, Dheeraj M. Nagaraj (27 May 2023)
3. SGDA with shuffling: faster convergence for nonconvex-PŁ minimax optimization
   Hanseul Cho, Chulhee Yun (12 Oct 2022)
4. Manipulating SGD with Data Ordering Attacks
   Ilia Shumailov, Zakhar Shumaylov, Dmitry Kazhdan, Yiren Zhao, Nicolas Papernot, Murat A. Erdogdu, Ross J. Anderson (19 Apr 2021)