
arXiv:2108.09081

FedSkel: Efficient Federated Learning on Heterogeneous Systems with Skeleton Gradients Update

20 August 2021
Junyu Luo
Jianlei Yang
Xucheng Ye
Xin Guo
Weisheng Zhao
    FedML
Abstract

Federated learning aims to protect users' privacy while performing data analysis across multiple participants. However, guaranteeing training efficiency on heterogeneous systems is challenging due to varying computational capabilities and communication bottlenecks. In this work, we propose FedSkel to enable computation-efficient and communication-efficient federated learning on edge devices by updating only the model's essential parts, named skeleton networks. FedSkel is evaluated on real edge devices with imbalanced datasets. Experimental results show that it achieves up to 5.52× speedup for CONV layers' back-propagation and 1.82× speedup for the whole training process, and reduces communication cost by 64.8%, with negligible accuracy loss.
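The core idea of updating only a "skeleton" of the model can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction, not the paper's implementation: the selection criterion here (largest L1 gradient norm per CONV filter) and the `keep_ratio` parameter are assumptions for illustration; the authors' actual skeleton-selection rule may differ.

```python
import numpy as np

def select_skeleton(grads, keep_ratio=0.35):
    """Pick 'skeleton' filters of a CONV layer: the filters whose
    gradients have the largest L1 norm. Hypothetical criterion.

    grads: array of shape (out_channels, in_channels, kH, kW)
    Returns indices of the filters to keep updating.
    """
    norms = np.abs(grads).reshape(grads.shape[0], -1).sum(axis=1)
    k = max(1, int(keep_ratio * grads.shape[0]))
    return np.argsort(norms)[-k:]

def skeleton_update(weights, grads, lr=0.01, keep_ratio=0.35):
    """Apply SGD only to skeleton filters; the remaining filters are
    frozen this round, so their gradients need neither be computed in
    back-propagation nor uploaded to the server. Updates in place and
    returns the skeleton indices (the client would transmit only
    grads[idx] plus idx, cutting communication cost)."""
    idx = select_skeleton(grads, keep_ratio)
    weights[idx] -= lr * grads[idx]
    return idx
```

With `keep_ratio=0.35`, roughly 65% of filter gradients are skipped per round, which is the kind of saving that would explain the reported back-propagation speedup and communication reduction.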
