ResearchTrend.AI

Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization

6 June 2025
Yuki Takezawa
Xiaowen Jiang
Anton Rodomanov
Sebastian U. Stich
arXiv:2506.05791
Main: 8 pages · Bibliography: 4 pages · Appendix: 35 pages · 6 figures · 5 tables
Abstract

Reducing communication complexity is critical for efficient decentralized optimization. The proximal decentralized optimization (PDO) framework is particularly appealing, as methods within this framework can exploit functional similarity among nodes to reduce communication rounds. Specifically, when local functions at different nodes are similar, these methods achieve faster convergence with fewer communication steps. However, existing PDO methods often require highly accurate solutions to subproblems associated with the proximal operator, resulting in significant computational overhead. In this work, we propose the Stabilized Proximal Decentralized Optimization (SPDO) method, which achieves state-of-the-art communication and computational complexities within the PDO framework. Additionally, we refine the analysis of existing PDO methods by relaxing subproblem accuracy requirements and leveraging average functional similarity. Experimental results demonstrate that SPDO significantly outperforms existing methods.
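To illustrate the general PDO pattern the abstract describes, here is a minimal sketch (not the paper's SPDO algorithm): each node approximately solves its proximal subproblem argmin_x f_i(x) + (1/(2η))‖x − x̄‖² with a few local gradient steps (an inexact solve, avoiding the expensive high-accuracy subproblem solutions the abstract criticizes), then nodes average their iterates, which stands in for one communication round. The quadratic losses, step sizes, and iteration counts are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, eta = 4, 3, 0.5

# Similar local objectives f_i(x) = 0.5 * ||A_i x - b_i||^2:
# a shared base matrix plus small per-node noise models functional similarity.
A0 = rng.standard_normal((dim, dim))
As = [A0 + 0.05 * rng.standard_normal((dim, dim)) for _ in range(n_nodes)]
bs = [rng.standard_normal(dim) for _ in range(n_nodes)]


def inexact_prox(A, b, x_bar, eta, inner_iters=20, lr=0.05):
    """Approximate prox of f_i at x_bar with a few gradient steps (inexact solve)."""
    x = x_bar.copy()
    for _ in range(inner_iters):
        grad = A.T @ (A @ x - b) + (x - x_bar) / eta
        x -= lr * grad
    return x


x_bar = np.zeros(dim)
for _ in range(30):  # each outer iteration costs one communication round
    locals_ = [inexact_prox(A, b, x_bar, eta) for A, b in zip(As, bs)]
    x_bar = np.mean(locals_, axis=0)  # averaging = the communication step

# The global objective should drop well below its value at the zero initialization.
obj = sum(0.5 * np.linalg.norm(A @ x_bar - b) ** 2 for A, b in zip(As, bs))
obj0 = sum(0.5 * np.linalg.norm(b) ** 2 for b in bs)
```

Because the local functions are similar, a relatively large prox step η works, and few outer (communication) rounds are needed; the inner loop trades extra local computation for that saving, which is the tension SPDO is designed to resolve.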

@article{takezawa2025_2506.05791,
  title={Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization},
  author={Yuki Takezawa and Xiaowen Jiang and Anton Rodomanov and Sebastian U. Stich},
  journal={arXiv preprint arXiv:2506.05791},
  year={2025}
}