Can 5th Generation Local Training Methods Support Client Sampling? Yes!

Michał Grudzień, Grigory Malinovsky, Peter Richtárik
29 December 2022 · arXiv:2212.14370

Papers citing "Can 5th Generation Local Training Methods Support Client Sampling? Yes!"

Showing 9 of 9 citing papers.
A-FedPD: Aligning Dual-Drift is All Federated Primal-Dual Learning Needs
Yan Sun, Li Shen, Dacheng Tao · FedML · 27 Sep 2024

Understanding Server-Assisted Federated Learning in the Presence of Incomplete Client Participation
Haibo Yang, Pei-Yuan Qiu, Prashant Khanduri, Minghong Fang, Jia Liu · FedML · 04 May 2024

LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
Laurent Condat, A. Maranjyan, Peter Richtárik · 07 Mar 2024

DualFL: A Duality-based Federated Learning Algorithm with Communication Acceleration in the General Convex Regime
Jongho Park, Jinchao Xu · FedML · 17 May 2023

Distributed Stochastic Optimization under a General Variance Condition
Kun-Yen Huang, Xiao Li, Shin-Yi Pu · FedML · 30 Jan 2023

An Optimal Algorithm for Strongly Convex Min-min Optimization
Alexander Gasnikov, D. Kovalev, Grigory Malinovsky · 29 Dec 2022

GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
A. Maranjyan, M. Safaryan, Peter Richtárik · 28 Oct 2022

A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu · FedML · 14 Jul 2021

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani · FedML · 14 Feb 2021