On the Worst-case Communication Overhead for Distributed Data Shuffling

30 September 2016
Mohamed Adel Attia, Ravi Tandon
FedML

Papers citing "On the Worst-case Communication Overhead for Distributed Data Shuffling"

3 papers shown

  • On the Fundamental Limits of Coded Data Shuffling for Distributed Machine Learning
    Adel M. Elmahdy, S. Mohajer. FedML. 11 Jul 2018
  • Near Optimal Coded Data Shuffling for Distributed Learning
    Mohamed Adel Attia, Ravi Tandon. FedML. 05 Jan 2018
  • Speeding Up Distributed Machine Learning Using Codes
    Kangwook Lee, Maximilian Lam, Ramtin Pedarsani, Dimitris Papailiopoulos, Kannan Ramchandran. 08 Dec 2015