Nested Dithered Quantization for Communication Reduction in Distributed Training
2 April 2019
Afshin Abdi, Faramarz Fekri
MQ

Papers citing "Nested Dithered Quantization for Communication Reduction in Distributed Training" (4 of 4 papers shown)
Remote Inference over Dynamic Links via Adaptive Rate Deep Task-Oriented Vector Quantization
Eyal Fishel, M. Malka, Shai Ginzach, Nir Shlezinger
07 Jan 2025
Compressed Private Aggregation for Scalable and Robust Federated Learning over Massive Networks
Natalie Lang, Nir Shlezinger, Rafael G. L. D'Oliveira, S. El Rouayheb
FedML
01 Aug 2023
DISCO: Distributed Inference with Sparse Communications
Minghai Qin, Chaowen Sun, Jaco A. Hofmann, D. Vučinić
FedML
22 Feb 2023
Learned Gradient Compression for Distributed Deep Learning
L. Abrahamyan, Yiming Chen, Giannis Bekoulis, Nikos Deligiannis
16 Mar 2021