Information Theoretic Limits of Data Shuffling for Distributed Learning

arXiv:1609.05181
16 September 2016
Mohamed Adel Attia, Ravi Tandon
Community: FedML

Papers citing "Information Theoretic Limits of Data Shuffling for Distributed Learning"

4 of 4 papers shown

1. Coded Computing for Distributed Graph Analytics
   Saurav Prakash, Amirhossein Reisizadeh, Ramtin Pedarsani, A. Avestimehr
   17 Jan 2018

2. Near Optimal Coded Data Shuffling for Distributed Learning (FedML)
   Mohamed Adel Attia, Ravi Tandon
   05 Jan 2018

3. Coded Computation over Heterogeneous Clusters
   Amirhossein Reisizadeh, Saurav Prakash, Ramtin Pedarsani, A. Avestimehr
   21 Jan 2017

4. Speeding Up Distributed Machine Learning Using Codes
   Kangwook Lee, Maximilian Lam, Ramtin Pedarsani, Dimitris Papailiopoulos, Kannan Ramchandran
   08 Dec 2015