ResearchTrend.AI

Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks

10 December 2018
Xin Zhang
Jia Liu
Zhengyuan Zhu
Elizabeth S. Bentley
ArXiv · PDF · HTML
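The paper's title describes distributed gradient descent with compressed communication for consensus over a network. A minimal illustrative sketch (not the authors' algorithm), assuming uniform quantization of transmitted states, quadratic local objectives, and a ring topology with a doubly stochastic mixing matrix:

```python
import numpy as np

def quantize(x, step=0.01):
    # Uniform quantization: transmitted values are snapped to a grid,
    # so each message needs far fewer bits than a full-precision float.
    return step * np.round(x / step)

def compressed_dgd(a, W, steps=500, lr=0.05, qstep=0.01):
    # Node i minimizes f_i(x) = 0.5 * (x - a[i])^2; the minimizer of
    # sum_i f_i is the mean of a. Each round, nodes broadcast quantized
    # states, mix them via W (consensus), and take a local gradient step.
    x = a.copy()
    for _ in range(steps):
        xq = quantize(x, qstep)       # compressed messages to neighbors
        x = W @ xq - lr * (x - a)     # consensus mixing + gradient step
    return x

n = 5
a = np.arange(n, dtype=float)         # local data 0..4, global optimum 2.0

# Doubly stochastic mixing matrix for a ring graph: each node averages
# itself with its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = compressed_dgd(a, W)              # all entries end up near 2.0
```

With a constant step size, the nodes converge only to a neighborhood of the optimum whose radius shrinks with the step size and quantization resolution; a diminishing step size or error-feedback compression would tighten this.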

Papers citing "Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks"

2 / 2 papers shown
1. Quantization for decentralized learning under subspace constraints. Roula Nassif, Stefan Vlaski, Marco Carpentiero, Vincenzo Matta, Marc Antonini, Ali H. Sayed. 16 Sep 2022.
2. Robust and Communication-Efficient Collaborative Learning. Amirhossein Reisizadeh, Hossein Taheri, Aryan Mokhtari, Hamed Hassani, Ramtin Pedarsani. 24 Jul 2019.