Preserved central model for faster bidirectional compression in distributed settings
Constantin Philippenko, Aymeric Dieuleveut
arXiv:2102.12528, 24 February 2021
Papers citing "Preserved central model for faster bidirectional compression in distributed settings" (7 papers shown)
LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
Laurent Condat, A. Maranjyan, Peter Richtárik
07 Mar 2024

Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik
15 Oct 2023

Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression
Yutong He, Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan
12 May 2023

DoCoFL: Downlink Compression for Cross-Device Federated Learning
Ron Dorfman, S. Vargaftik, Y. Ben-Itzhak, Kfir Y. Levy
01 Feb 2023

Downlink Compression Improves TopK Sparsification
William Zou, H. Sterck, Jun Liu
30 Sep 2022

Federated Expectation Maximization with heterogeneity mitigation and variance reduction
Aymeric Dieuleveut, G. Fort, Eric Moulines, Geneviève Robin
03 Nov 2021

Linearly Converging Error Compensated SGD
Eduard A. Gorbunov, D. Kovalev, Dmitry Makarenko, Peter Richtárik
23 Oct 2020