DoCoFL: Downlink Compression for Cross-Device Federated Learning

1 February 2023
Ron Dorfman, S. Vargaftik, Y. Ben-Itzhak, Kfir Y. Levy
Abstract

Many compression techniques have been proposed to reduce the communication overhead of Federated Learning training procedures. However, these are typically designed for compressing model updates, which are expected to decay throughout training. As a result, such methods are inapplicable to downlink (i.e., from the parameter server to clients) compression in the cross-device setting, where heterogeneous clients may appear only once during training and thus must download the model parameters. Accordingly, we propose DoCoFL, a new framework for downlink compression in the cross-device setting. Importantly, DoCoFL can be seamlessly combined with many uplink compression schemes, rendering it suitable for bi-directional compression. Through extensive evaluation, we show that DoCoFL offers significant bi-directional bandwidth reduction while achieving accuracy competitive with that of a baseline without any compression.

View on arXiv: https://arxiv.org/abs/2302.00543