
Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees
arXiv:2110.03313 · 7 October 2021
Aleksandr Beznosikov, Peter Richtárik, Michael Diskin, Max Ryabinin, Alexander Gasnikov
Topic: FedML

Papers citing "Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees" (10 papers shown)
  • Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
    Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik · 15 Oct 2023
  • Distributed Extra-gradient with Optimal Complexity and Communication Guarantees
    Ali Ramezani-Kebrya, Kimon Antonakopoulos, Igor Krawczuk, Justin Deschenaux, V. Cevher · 17 Aug 2023
  • Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities
    Aleksandr Beznosikov, Martin Takáč, Alexander Gasnikov · 15 Feb 2023
  • Federated Minimax Optimization: Improved Convergence Analyses and Algorithms
    Pranay Sharma, Rohan Panda, Gauri Joshi, P. Varshney · FedML · 09 Mar 2022
  • Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods
    Aleksandr Beznosikov, Eduard A. Gorbunov, Hugo Berard, Nicolas Loizou · 15 Feb 2022
  • EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback
    Ilyas Fatkhullin, Igor Sokolov, Eduard A. Gorbunov, Zhize Li, Peter Richtárik · 07 Oct 2021
  • On the One-sided Convergence of Adam-type Algorithms in Non-convex Non-concave Min-max Optimization
    Zehao Dou, Yuanzhi Li · 29 Sep 2021
  • Stochastic Variance Reduction for Variational Inequality Methods
    Ahmet Alacaoglu, Yura Malitsky · 16 Feb 2021
  • FreeLB: Enhanced Adversarial Training for Natural Language Understanding
    Chen Zhu, Yu Cheng, Zhe Gan, S. Sun, Tom Goldstein, Jingjing Liu · AAML · 25 Sep 2019
  • GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
    Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman · ELM · 20 Apr 2018