ResearchTrend.AI

Federated learning compression designed for lightweight communications
arXiv:2310.14693 · 23 October 2023
Lucas Grativol Ribeiro, Mathieu Léonardon, Guillaume Muller, Virginie Fresse, Matthieu Arzel
FedML

Papers citing "Federated learning compression designed for lightweight communications" (14 / 14 papers shown)

AnycostFL: Efficient On-Demand Federated Learning over Heterogeneous Edge Devices
Peichun Li, Guoliang Cheng, Xumin Huang, Jiawen Kang, Rong Yu, Yuan Wu, Miao Pan
FedML · 99 · 22 · 0 · 08 Jan 2023

ZeroFL: Efficient On-Device Training for Federated Learning with Local Sparsity
Xinchi Qiu, Javier Fernandez-Marques, Pedro Gusmão, Yan Gao, Titouan Parcollet, Nicholas D. Lane
FedML · 69 · 71 · 0 · 04 Aug 2022

On-Device Training Under 256KB Memory
Ji Lin, Ligeng Zhu, Wei-Ming Chen, Wei-Chen Wang, Chuang Gan, Song Han
MQ · 88 · 207 · 0 · 30 Jun 2022

Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
MQ · 314 · 720 · 0 · 31 Jan 2021

Rethinking Weight Decay For Efficient Neural Network Pruning
Hugo Tessier, Vincent Gripon, Mathieu Léonardon, M. Arzel, T. Hannagan, David Bertrand
81 · 26 · 0 · 20 Nov 2020

Flower: A Friendly Federated Learning Research Framework
Daniel J. Beutel, Taner Topal, Akhil Mathur, Xinchi Qiu, Javier Fernandez-Marques, ..., Lorenzo Sani, Kwing Hei Li, Titouan Parcollet, Pedro Porto Buarque de Gusmão, Nicholas D. Lane
FedML · 136 · 809 · 0 · 28 Jul 2020

Adaptive Federated Optimization
Sashank J. Reddi, Zachary B. Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Konecný, Sanjiv Kumar, H. B. McMahan
FedML · 177 · 1,437 · 0 · 29 Feb 2020

Sparse Weight Activation Training
Md Aamir Raihan, Tor M. Aamodt
91 · 73 · 0 · 07 Jan 2020

Advances and Open Problems in Federated Learning
Peter Kairouz, H. B. McMahan, Brendan Avent, A. Bellet, M. Bennis, ..., Zheng Xu, Qiang Yang, Felix X. Yu, Han Yu, Sen Zhao
FedML, AI4CE · 256 · 6,261 · 0 · 10 Dec 2019

Measuring the Effects of Non-Identical Data Distribution for Federated Visual Classification
T. Hsu, Qi, Matthew Brown
FedML · 143 · 1,150 · 0 · 13 Sep 2019

A Survey of Model Compression and Acceleration for Deep Neural Networks
Yu Cheng, Duo Wang, Pan Zhou, Zhang Tao
74 · 1,095 · 0 · 23 Oct 2017

Communication-Efficient Learning of Deep Networks from Decentralized Data
H. B. McMahan, Eider Moore, Daniel Ramage, S. Hampson, Blaise Agüera y Arcas
FedML · 406 · 17,486 · 0 · 17 Feb 2016

BinaryConnect: Training Deep Neural Networks with binary weights during propagations
Matthieu Courbariaux, Yoshua Bengio, J. David
MQ · 209 · 2,985 · 0 · 02 Nov 2015

Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding
Song Han, Huizi Mao, W. Dally
3DGS · 259 · 8,842 · 0 · 01 Oct 2015