Distributed stochastic optimization with gradient tracking over strongly-connected networks

18 March 2019
Ran Xin
Anit Kumar Sahu
U. Khan
S. Kar
Abstract

In this paper, we study distributed stochastic optimization to minimize a sum of smooth and strongly-convex local cost functions over a network of agents, communicating over a strongly-connected graph. Assuming that each agent has access to a stochastic first-order oracle ($\mathcal{SFO}$), we propose a novel distributed method, called $\mathcal{S}$-$\mathcal{AB}$, where each agent uses an auxiliary variable to asymptotically track the gradient of the global cost in expectation. The $\mathcal{S}$-$\mathcal{AB}$ algorithm employs row- and column-stochastic weights simultaneously to ensure both consensus and optimality. Since doubly-stochastic weights are not used, $\mathcal{S}$-$\mathcal{AB}$ is applicable to arbitrary strongly-connected graphs. We show that, under a sufficiently small constant step-size, $\mathcal{S}$-$\mathcal{AB}$ converges linearly (in the expected mean-square sense) to a neighborhood of the global minimizer. We present numerical simulations based on real-world data sets to illustrate the theoretical results.
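To make the gradient-tracking idea concrete, the following is a minimal sketch of a stochastic gradient-tracking update that uses a row-stochastic mixing matrix A for the iterates and a column-stochastic matrix B for the gradient trackers, run on a toy least-squares problem over a directed ring. It is an illustration of the general mechanism, not the paper's exact pseudocode or analysis setting: the graph, data, step size, and update order are assumptions chosen for the example.

```python
# Sketch of distributed stochastic gradient tracking with a row-stochastic
# matrix A (iterate mixing) and a column-stochastic matrix B (tracker mixing)
# on a toy least-squares problem. All constants and the problem data are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, p, samples = 5, 3, 50          # agents, dimension, samples per agent

# Local data: agent i holds (H_i, b_i); the global cost is
# f(x) = sum_i ||H_i x - b_i||^2 / (2 * samples).
H = rng.normal(size=(n, samples, p))
x_true = rng.normal(size=p)
b = H @ x_true + 0.1 * rng.normal(size=(n, samples))

# Directed ring with self-loops (strongly connected): agent i hears from i-1.
adj = np.eye(n, dtype=bool)
for i in range(n):
    adj[i, (i - 1) % n] = True

# Row-stochastic A (rows sum to 1) and column-stochastic B (columns sum to 1)
# built from the same directed graph; no doubly-stochastic weights are needed.
A = adj / adj.sum(axis=1, keepdims=True)
B = adj / adj.sum(axis=0, keepdims=True)

def stochastic_grads(x):
    """One-sample stochastic gradient of each local cost (the oracle calls)."""
    g = np.zeros((n, p))
    for i in range(n):
        j = rng.integers(samples)
        g[i] = (H[i, j] @ x[i] - b[i, j]) * H[i, j]
    return g

alpha = 0.05                      # small constant step size (assumed value)
x = rng.normal(size=(n, p))       # local iterates, one row per agent
y = stochastic_grads(x)           # gradient trackers, initialized at local grads
g_old = y.copy()

for _ in range(2000):
    x = A @ x - alpha * y         # mix iterates, descend along the tracker
    g_new = stochastic_grads(x)
    y = B @ y + g_new - g_old     # track the global gradient in expectation
    g_old = g_new

print("spread of local iterates:", np.max(np.abs(x - x.mean(axis=0))))
print("distance to x_true      :", np.linalg.norm(x.mean(axis=0) - x_true))
```

Because A only needs row sums of one and B only column sums of one, each agent can form its own weights from in-neighbor or out-degree information alone, which is what allows this style of method to run on arbitrary strongly-connected directed graphs; with a constant step size and noisy gradients, the iterates settle into a neighborhood of the minimizer rather than converging exactly.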

View on arXiv