
Combating Client Dropout in Federated Learning via Friend Model Substitution
Heqiang Wang, Jie Xu
26 May 2022 · arXiv:2205.13222 · FedML

Papers citing "Combating Client Dropout in Federated Learning via Friend Model Substitution"

15 / 15 papers shown
 1. FedSoft: Soft Clustered Federated Learning with Proximal Local Updating
    Yichen Ruan, Carlee Joe-Wong · FedML · 11 Dec 2021 · 50 / 94 / 0
 2. Anarchic Federated Learning
    Haibo Yang, Xin Zhang, Prashant Khanduri, Jia Liu · FedML · 23 Aug 2021 · 36 / 58 / 0
 3. Node Selection Toward Faster Convergence for Federated Learning on Non-IID Data
    Hongda Wu, Ping Wang · FedML · 14 May 2021 · 65 / 139 / 0
 4. Achieving Linear Speedup with Partial Worker Participation in Non-IID Federated Learning
    Haibo Yang, Minghong Fang, Jia Liu · FedML · 27 Jan 2021 · 62 / 258 / 0
 5. Optimal Client Sampling for Federated Learning
    Jiajun He, Samuel Horváth, Peter Richtárik · FedML · 26 Oct 2020 · 75 / 200 / 0
 6. An Efficient Framework for Clustered Federated Learning
    Avishek Ghosh, Jichan Chung, Dong Yin, Kannan Ramchandran · FedML · 07 Jun 2020 · 65 / 857 / 0
 7. Adaptive Federated Optimization
    Sashank J. Reddi, Zachary B. Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Konecný, Sanjiv Kumar, H. B. McMahan · FedML · 29 Feb 2020 · 163 / 1,431 / 0
 8. Distributed Non-Convex Optimization with Sublinear Speedup under Intermittent Client Availability
    Yikai Yan, Chaoyue Niu, Yucheng Ding, Zhenzhe Zheng, Fan Wu, Guihai Chen, Shaojie Tang, Zhihua Wu · FedML · 18 Feb 2020 · 140 / 38 / 0
 9. SCAFFOLD: Stochastic Controlled Averaging for Federated Learning
    Sai Praneeth Karimireddy, Satyen Kale, M. Mohri, Sashank J. Reddi, Sebastian U. Stich, A. Suresh · FedML · 14 Oct 2019 · 65 / 346 / 0
10. SAFA: a Semi-Asynchronous Protocol for Fast Federated Learning with Low Overhead
    A. Masullo, Ligang He, Toby Perrett, Rui Mao, Carsten Maple, Majid Mirmehdi · 03 Oct 2019 · 66 / 313 / 0
11. Robust and Communication-Efficient Federated Learning from Non-IID Data
    Felix Sattler, Simon Wiedemann, K. Müller, Wojciech Samek · FedML · 07 Mar 2019 · 64 / 1,353 / 0
12. Cooperative SGD: A unified Framework for the Design and Analysis of Communication-Efficient SGD Algorithms
    Jianyu Wang, Gauri Joshi · 22 Aug 2018 · 154 / 349 / 0
13. Parallel Restarted SGD with Faster Convergence and Less Communication: Demystifying Why Model Averaging Works for Deep Learning
    Hao Yu, Sen Yang, Shenghuo Zhu · MoMe, FedML · 17 Jul 2018 · 73 / 604 / 0
14. Local SGD Converges Fast and Communicates Little
    Sebastian U. Stich · FedML · 24 May 2018 · 164 / 1,061 / 0
15. Federated Learning: Strategies for Improving Communication Efficiency
    Jakub Konecný, H. B. McMahan, Felix X. Yu, Peter Richtárik, A. Suresh, Dave Bacon · FedML · 18 Oct 2016 · 291 / 4,636 / 0