In this paper, we study distributed stochastic optimization to minimize a sum of smooth and strongly-convex local cost functions over a network of agents communicating over a strongly-connected graph. Assuming that each agent has access to a stochastic first-order oracle (SFO), we propose a novel distributed method, called S-AB, where each agent uses an auxiliary variable to asymptotically track the gradient of the global cost in expectation. The S-AB algorithm employs row- and column-stochastic weights simultaneously to ensure both consensus and optimality. Since doubly-stochastic weights are not required, S-AB is applicable to arbitrary strongly-connected graphs. We show that under a sufficiently small constant step-size, S-AB converges linearly (in the expected mean-square sense) to a neighborhood of the global minimizer. We present numerical simulations based on real-world data sets to illustrate the theoretical results.
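For concreteness, a stochastic gradient-tracking recursion of this row-/column-stochastic type can be sketched as follows; this is a generic template under assumed notation ($A = \{a_{ij}\}$ row-stochastic, $B = \{b_{ij}\}$ column-stochastic, step-size $\alpha > 0$, and $g_i(\cdot, \boldsymbol{\xi})$ the stochastic gradient returned by agent $i$'s oracle), not necessarily the paper's exact update:

\begin{align*}
\mathbf{x}_{i,k+1} &= \textstyle\sum_{j=1}^{n} a_{ij}\, \mathbf{x}_{j,k} - \alpha\, \mathbf{y}_{i,k},\\
\mathbf{y}_{i,k+1} &= \textstyle\sum_{j=1}^{n} b_{ij}\, \mathbf{y}_{j,k} + g_i(\mathbf{x}_{i,k+1}, \boldsymbol{\xi}_{i,k+1}) - g_i(\mathbf{x}_{i,k}, \boldsymbol{\xi}_{i,k}),
\end{align*}

with the initialization $\mathbf{y}_{i,0} = g_i(\mathbf{x}_{i,0}, \boldsymbol{\xi}_{i,0})$, so that each auxiliary variable $\mathbf{y}_{i,k}$ asymptotically tracks the average of the agents' stochastic gradients in expectation.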