
Achieving Linear Speedup in Asynchronous Federated Learning with Heterogeneous Clients

arXiv:2402.11198 · 17 February 2024
Xiaolu Wang, Zijian Li, Shi Jin, Jun Zhang
FedML

Papers citing "Achieving Linear Speedup in Asynchronous Federated Learning with Heterogeneous Clients"

2 / 2 papers shown
1. Convergence Analysis of Asynchronous Federated Learning with Gradient Compression for Non-Convex Optimization
   Diying Yang, Yingwei Hou, Danyang Xiao, Weigang Wu
   FedML · 28 Apr 2025
2. DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing
   Jiawei Shao, Yuchang Sun, Songze Li, Jun Zhang
   OOD · 06 Oct 2022