arXiv: 2212.14370
Can 5th Generation Local Training Methods Support Client Sampling? Yes!
Michał Grudzień, Grigory Malinovsky, Peter Richtárik
29 December 2022
Papers citing "Can 5th Generation Local Training Methods Support Client Sampling? Yes!" (9 of 9 papers shown):
- "A-FedPD: Aligning Dual-Drift is All Federated Primal-Dual Learning Needs" by Yan Sun, Li Shen, Dacheng Tao (27 Sep 2024) [FedML]
- "Understanding Server-Assisted Federated Learning in the Presence of Incomplete Client Participation" by Haibo Yang, Pei-Yuan Qiu, Prashant Khanduri, Minghong Fang, Jia Liu (04 May 2024) [FedML]
- "LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression" by Laurent Condat, A. Maranjyan, Peter Richtárik (07 Mar 2024)
- "DualFL: A Duality-based Federated Learning Algorithm with Communication Acceleration in the General Convex Regime" by Jongho Park, Jinchao Xu (17 May 2023) [FedML]
- "Distributed Stochastic Optimization under a General Variance Condition" by Kun-Yen Huang, Xiao Li, Shin-Yi Pu (30 Jan 2023) [FedML]
- "An Optimal Algorithm for Strongly Convex Min-min Optimization" by Alexander Gasnikov, D. Kovalev, Grigory Malinovsky (29 Dec 2022)
- "GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity" by A. Maranjyan, M. Safaryan, Peter Richtárik (28 Oct 2022)
- "A Field Guide to Federated Optimization" by Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu (14 Jul 2021) [FedML]
- "Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients" by A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani (14 Feb 2021) [FedML]