Distributed Learning with Sublinear Communication

arXiv:1902.11259 · 28 February 2019
Jayadev Acharya, Christopher De Sa, Dylan J. Foster, Karthik Sridharan
FedML

Papers citing "Distributed Learning with Sublinear Communication"

6 papers shown
Fundamental limits of over-the-air optimization: Are analog schemes optimal?
Shubham K. Jha, Prathamesh Mayekar, Himanshu Tyagi
11 Sep 2021

Coded Gradient Aggregation: A Tradeoff Between Communication Costs at Edge Nodes and at Helper Nodes
B. Sasidharan, Anoop Thomas
06 May 2021

On the Utility of Gradient Compression in Distributed Training Systems
Saurabh Agarwal, Hongyi Wang, Shivaram Venkataraman, Dimitris Papailiopoulos
28 Feb 2021

MixML: A Unified Analysis of Weakly Consistent Parallel Learning
Yucheng Lu, J. Nash, Christopher De Sa
FedML
14 May 2020

Hyper-Sphere Quantization: Communication-Efficient SGD for Federated Learning
Xinyan Dai, Xiao Yan, Kaiwen Zhou, Han Yang, K. K. Ng, James Cheng, Yu Fan
FedML
12 Nov 2019

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010