

Gradient Scheduling with Global Momentum for Non-IID Data Distributed Asynchronous Training

21 February 2019
Chengjie Li, Ruixuan Li, Yining Qi, Yuhua Li, Pan Zhou, Song Guo, Keqin Li
arXiv: 1902.07848

Papers citing "Gradient Scheduling with Global Momentum for Non-IID Data Distributed Asynchronous Training"

5 of 5 papers shown

Improving Federated Learning Communication Efficiency with Global Momentum Fusion for Gradient Compression Schemes
Chun-Chih Kuo, Ted T. Kuo, Chia-Yu Lin
FedML
17 Nov 2022

AsyncFedED: Asynchronous Federated Learning with Euclidean Distance based Adaptive Weight Aggregation
Qiyuan Wang, Qianqian Yang, Shibo He, Zhiguo Shi, Jiming Chen
FedML
27 May 2022

Towards Efficient and Stable K-Asynchronous Federated Learning with Unbounded Stale Gradients on Non-IID Data
Zihao Zhou, Yanan Li, Xuebin Ren, Shusen Yang
02 Mar 2022

Aggregation Delayed Federated Learning
Ye Xue, Diego Klabjan, Yuan Luo
FedML, OOD
17 Aug 2021

Cross-Gradient Aggregation for Decentralized Learning from Non-IID data
Yasaman Esfandiari, Sin Yong Tan, Zhanhong Jiang, Aditya Balu, Ethan Herron, C. Hegde, S. Sarkar
OOD
02 Mar 2021