arXiv: 2204.13169
FedShuffle: Recipes for Better Use of Local Work in Federated Learning
27 April 2022
Samuel Horváth, Maziar Sanjabi, Lin Xiao, Peter Richtárik, Michael G. Rabbat
Papers citing "FedShuffle: Recipes for Better Use of Local Work in Federated Learning" (6 / 6 papers shown)
A-FedPD: Aligning Dual-Drift is All Federated Primal-Dual Learning Needs
Yan Sun, Li Shen, Dacheng Tao
27 Sep 2024
GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
A. Maranjyan, M. Safaryan, Peter Richtárik
28 Oct 2022
Federated Optimization Algorithms with Random Reshuffling and Gradient Compression
Abdurakhmon Sadiev, Grigory Malinovsky, Eduard A. Gorbunov, Igor Sokolov, Ahmed Khaled, Konstantin Burlachenko, Peter Richtárik
14 Jun 2022
Straggler-Resilient Personalized Federated Learning
Isidoros Tziotis, Zebang Shen, Ramtin Pedarsani, Hamed Hassani, Aryan Mokhtari
05 Jun 2022
Papaya: Practical, Private, and Scalable Federated Learning
Dzmitry Huba, John Nguyen, Kshitiz Malik, Ruiyu Zhu, Michael G. Rabbat, ..., H. Srinivas, Kaikai Wang, Anthony Shoumikhin, Jesik Min, Mani Malek
08 Nov 2021
FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
Samuel Horváth, Stefanos Laskaridis, Mario Almeida, Ilias Leondiadis, Stylianos I. Venieris, Nicholas D. Lane
26 Feb 2021