OSP: Boosting Distributed Model Training with 2-stage Synchronization

29 June 2023
Zixuan Chen
Lei Shi
Xuandong Liu
Jiahui Li
Sen Liu
Yang Xu

Papers citing "OSP: Boosting Distributed Model Training with 2-stage Synchronization"

Communication-Efficient Large-Scale Distributed Deep Learning: A Comprehensive Survey
Feng Liang
Zhen Zhang
Haifeng Lu
Victor C. M. Leung
Yanyi Guo
Xiping Hu
GNN
09 Apr 2024