ResearchTrend.AI
Distributed Learning with Sparse Communications by Identification
arXiv:1812.03871
10 December 2018
Dmitry Grishchenko, F. Iutzeler, J. Malick, Massih-Reza Amini

Papers citing "Distributed Learning with Sparse Communications by Identification"

4 / 4 papers shown
Asynchronous Distributed Optimization with Redundancy in Cost Functions
Shuo Liu, Nirupam Gupta, Nitin H. Vaidya
07 Jun 2021
On the Utility of Gradient Compression in Distributed Training Systems
Saurabh Agarwal, Hongyi Wang, Shivaram Venkataraman, Dimitris Papailiopoulos
28 Feb 2021
Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
Filip Hanzely
26 Aug 2020
Natural Compression for Distributed Deep Learning
Samuel Horváth, Chen-Yu Ho, L. Horvath, Atal Narayan Sahu, Marco Canini, Peter Richtárik
27 May 2019