Distributed Learning with Sublinear Communication
arXiv:1902.11259
28 February 2019
Jayadev Acharya
Christopher De Sa
Dylan J. Foster
Karthik Sridharan
Papers citing "Distributed Learning with Sublinear Communication" (6 of 6 papers shown)
Fundamental limits of over-the-air optimization: Are analog schemes optimal?
Shubham K. Jha, Prathamesh Mayekar, Himanshu Tyagi (11 Sep 2021)
Coded Gradient Aggregation: A Tradeoff Between Communication Costs at Edge Nodes and at Helper Nodes
B. Sasidharan, Anoop Thomas (06 May 2021)
On the Utility of Gradient Compression in Distributed Training Systems
Saurabh Agarwal, Hongyi Wang, Shivaram Venkataraman, Dimitris Papailiopoulos (28 Feb 2021)
MixML: A Unified Analysis of Weakly Consistent Parallel Learning
Yucheng Lu, J. Nash, Christopher De Sa (14 May 2020)
Hyper-Sphere Quantization: Communication-Efficient SGD for Federated Learning
Xinyan Dai, Xiao Yan, Kaiwen Zhou, Han Yang, K. K. Ng, James Cheng, Yu Fan (12 Nov 2019)
Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao (07 Dec 2010)