ResearchTrend.AI
arXiv: 2005.06728 · Cited By
OD-SGD: One-step Delay Stochastic Gradient Descent for Distributed Training

14 May 2020
Yemao Xu, Dezun Dong, Weixia Xu, Xiangke Liao

Papers citing "OD-SGD: One-step Delay Stochastic Gradient Descent for Distributed Training"

2 / 2 papers shown
CD-SGD: Distributed Stochastic Gradient Descent with Compression and Delay Compensation
Enda Yu, Dezun Dong, Yemao Xu, Shuo Ouyang, Xiangke Liao
21 Jun 2021
Decentralized Online Learning: Take Benefits from Others' Data without Sharing Your Own to Track Global Trend
Wendi Wu, Zongren Li, Yawei Zhao, Chenkai Yu, P. Zhao, Ji Liu
FedML
29 Jan 2019