arXiv:1902.00340
Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication

1 February 2019
Anastasia Koloskova
Sebastian U. Stich
Martin Jaggi
Abstract

We consider decentralized stochastic optimization with the objective function (e.g. data samples for a machine learning task) distributed over $n$ machines that can only communicate with their neighbors on a fixed communication graph. To reduce the communication bottleneck, the nodes compress (e.g. quantize or sparsify) their model updates. We cover both unbiased and biased compression operators with quality denoted by $\omega \leq 1$ ($\omega = 1$ meaning no compression). We (i) propose a novel gossip-based stochastic gradient descent algorithm, CHOCO-SGD, that converges at rate $\mathcal{O}\left(1/(nT) + 1/(T \delta^2 \omega)^2\right)$ for strongly convex objectives, where $T$ denotes the number of iterations and $\delta$ the eigengap of the connectivity matrix. Although the compression quality and the network connectivity affect the higher-order terms of the rate, the first term, $\mathcal{O}(1/(nT))$, is the same as for the centralized baseline with exact communication. We (ii) present a novel gossip algorithm, CHOCO-GOSSIP, for the average consensus problem that converges in time $\mathcal{O}(1/(\delta^2\omega) \log(1/\epsilon))$ for accuracy $\epsilon > 0$. This is (to the best of our knowledge) the first gossip algorithm that supports arbitrary compressed messages for $\omega > 0$ and still exhibits linear convergence. We (iii) show in experiments that both of our algorithms outperform the respective state-of-the-art baselines and that CHOCO-SGD can reduce communication by at least two orders of magnitude.
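The abstract states CHOCO-GOSSIP's guarantee but not its update rule, so the following Python sketch only illustrates the general idea of compressed gossip averaging: each node keeps a private value and a publicly shared copy, broadcasts a compressed correction to its shared copy, and then takes a small gossip step on the shared copies. The function names (`top_k_compress`, `choco_gossip_sketch`), the top-k compressor, the constant step size `gamma`, and the mixing matrix `W` are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def top_k_compress(v, k):
    """Illustrative biased compressor: keep the k largest-magnitude
    entries of v and zero out the rest (top-k sparsification)."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def choco_gossip_sketch(x0, W, gamma=0.05, k=1, iters=1000):
    """Hedged sketch of compressed gossip averaging.

    x0    : (n, d) array, one row of initial values per node.
    W     : (n, n) symmetric, doubly stochastic mixing matrix whose
            sparsity pattern matches the communication graph.
    gamma : consensus step size (assumed a small constant here; in the
            paper it depends on the eigengap and compression quality).
    k     : sparsification level of the illustrative top-k compressor.
    """
    n, d = x0.shape
    x = x0.copy()             # private iterates, one per node
    x_hat = np.zeros((n, d))  # publicly shared copies built from compressed messages
    for _ in range(iters):
        # Each node compresses the gap between its private iterate and its
        # shared copy and "broadcasts" that small message to its neighbors.
        q = np.stack([top_k_compress(x[i] - x_hat[i], k) for i in range(n)])
        x_hat = x_hat + q
        # Gossip step on the shared copies; row i of W @ x_hat is the
        # weighted neighborhood average seen by node i.
        x = x + gamma * (W @ x_hat - x_hat)
    return x

# Tiny usage example: averaging on a ring of 8 nodes.
if __name__ == "__main__":
    n, d = 8, 5
    rng = np.random.default_rng(0)
    x0 = rng.normal(size=(n, d))
    W = np.zeros((n, n))
    for i in range(n):                      # uniform ring weights
        W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    x = choco_gossip_sketch(x0, W, gamma=0.05, k=2, iters=5000)
    print(np.abs(x - x0.mean(axis=0)).max())  # remaining consensus error
```

The step size and compressor here may need tuning to converge on other graphs; the snippet is only meant to make the "send compressed differences, average the shared copies" idea concrete. CHOCO-SGD, per the abstract a gossip-based SGD method, presumably interleaves local stochastic gradient steps with this kind of compressed gossip update.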
