Asynchronous Decentralized SGD under Non-Convexity: A Block-Coordinate Descent Framework

15 May 2025
Yijie Zhou
Shi Pu
arXiv | PDF | HTML
Abstract

Decentralized optimization has become vital for leveraging distributed data without central control, enhancing scalability and privacy. However, practical deployments face fundamental challenges due to heterogeneous computation speeds and unpredictable communication delays. This paper introduces a refined model of Asynchronous Decentralized Stochastic Gradient Descent (ADSGD) under practical assumptions of bounded computation and communication times. To understand the convergence of ADSGD, we first analyze Asynchronous Stochastic Block Coordinate Descent (ASBCD) as a tool, and then show that ADSGD converges under computation-delay-independent step sizes. The convergence result is established without assuming bounded data heterogeneity. Empirical experiments reveal that ADSGD outperforms existing methods in wall-clock convergence time across various scenarios. With its simplicity, efficiency in memory and communication, and resilience to communication and computation delays, ADSGD is well-suited for real-world decentralized learning tasks.
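As a rough illustration of the setting described above, the sketch below simulates asynchronous decentralized SGD on a small ring network: nodes hold heterogeneous local data, faster nodes update more often, and each update mixes with neighbors before taking a local stochastic-gradient step with a fixed step size. This is a minimal sketch, not the authors' algorithm or code; the problem sizes, ring topology, mixing weights, wake-up probabilities, and step size are all illustrative assumptions.

```python
# Minimal event-driven sketch of asynchronous decentralized SGD (assumed setup,
# not the paper's implementation). At each event one node wakes up, averages its
# copy with its ring neighbors, and takes a stochastic-gradient step on a local
# least-squares objective. Nodes with shorter computation times wake more often,
# mimicking heterogeneous computation speeds.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, samples_per_node = 8, 5, 50

# Heterogeneous local data: node i fits y = A_i x + noise. The shift (0.5 * i)
# just makes the local objectives differ across nodes.
A = [rng.normal(size=(samples_per_node, dim)) + 0.5 * i for i in range(n_nodes)]
x_true = rng.normal(size=dim)
b = [Ai @ x_true + 0.1 * rng.normal(size=samples_per_node) for Ai in A]

# Ring topology with symmetric, doubly stochastic mixing weights.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

x = np.zeros((n_nodes, dim))              # local parameter copies
speeds = rng.uniform(0.2, 1.0, n_nodes)   # heterogeneous computation speeds
wake_prob = speeds / speeds.sum()
step_size = 0.01                          # fixed step size, not tuned to delays

def local_grad(i, xi):
    """Stochastic gradient of node i's least-squares loss on one random sample."""
    j = rng.integers(samples_per_node)
    return A[i][j] * (A[i][j] @ xi - b[i][j])

for event in range(5000):
    i = rng.choice(n_nodes, p=wake_prob)  # faster nodes are activated more often
    mixed = W[i] @ x                      # average with current neighbor copies
    x[i] = mixed - step_size * local_grad(i, mixed)

# Sanity check: consensus error and distance to the generating parameter.
x_bar = x.mean(axis=0)
print("consensus error:", np.linalg.norm(x - x_bar) / n_nodes)
print("distance to x_true:", np.linalg.norm(x_bar - x_true))
```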

@article{zhou2025_2505.10322,
  title={Asynchronous Decentralized SGD under Non-Convexity: A Block-Coordinate Descent Framework},
  author={Yijie Zhou and Shi Pu},
  journal={arXiv preprint arXiv:2505.10322},
  year={2025}
}