arXiv:2102.06752
A Hybrid Variance-Reduced Method for Decentralized Stochastic Non-Convex Optimization

12 February 2021
Ran Xin, Usman A. Khan, Soummya Kar
Abstract

This paper considers decentralized stochastic optimization over a network of $n$ nodes, where each node possesses a smooth non-convex local cost function and the goal of the networked nodes is to find an $\epsilon$-accurate first-order stationary point of the sum of the local costs. We focus on an online setting, where each node accesses its local cost only by means of a stochastic first-order oracle that returns a noisy version of the exact gradient. In this context, we propose a novel single-loop decentralized hybrid variance-reduced stochastic gradient method, called GT-HSGD, that outperforms the existing approaches in terms of both the oracle complexity and practical implementation. The GT-HSGD algorithm implements specialized local hybrid stochastic gradient estimators that are fused over the network to track the global gradient. Remarkably, GT-HSGD achieves a network topology-independent oracle complexity of $O(n^{-1}\epsilon^{-3})$ when the required error tolerance $\epsilon$ is small enough, leading to a linear speedup with respect to the centralized optimal online variance-reduced approaches that operate on a single node. Numerical experiments are provided to illustrate our main technical results.
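To make the update structure described above concrete, here is a minimal NumPy sketch of a GT-HSGD-style loop: each node maintains a hybrid (SARAH-plus-SGD) stochastic gradient estimator and the estimators are fused over the network via gradient tracking. The synthetic non-convex problem (regularized logistic regression), the ring-topology mixing matrix, and all hyperparameters are illustrative assumptions, not the paper's experimental setup, and details such as the larger initial mini-batch used in the paper are simplified here.

```python
# Minimal sketch of a GT-HSGD-style decentralized update (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, m_local = 8, 10, 200            # nodes, dimension, samples per node

# Synthetic local data: node i holds (A[i], b[i]).
A = rng.normal(size=(n_nodes, m_local, dim))
b = np.sign(rng.normal(size=(n_nodes, m_local)))
lam = 0.1                                      # weight of the non-convex regularizer

def stoch_grad(i, x, idx):
    """Stochastic first-order oracle of node i: mini-batch gradient of
    logistic loss plus the non-convex regularizer lam * sum(x^2 / (1 + x^2))."""
    a, y = A[i][idx], b[i][idx]
    z = y * (a @ x)
    g_loss = -(a * (y / (1.0 + np.exp(z)))[:, None]).mean(axis=0)
    g_reg = lam * 2.0 * x / (1.0 + x**2) ** 2
    return g_loss + g_reg

# Doubly stochastic mixing matrix on a ring topology (illustrative choice).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

alpha, beta, batch, T = 0.05, 0.1, 5, 500      # step size, hybrid weight, batch, iterations

x = np.tile(rng.normal(size=dim), (n_nodes, 1))    # identical initialization at all nodes
v = np.zeros((n_nodes, dim))                       # local hybrid gradient estimators
for i in range(n_nodes):                           # initialize with one mini-batch gradient
    v[i] = stoch_grad(i, x[i], rng.choice(m_local, batch, replace=False))
y = v.copy()                                       # gradient trackers

for t in range(T):
    x_new = W @ x - alpha * y                      # consensus step along the tracked gradient
    v_new = np.empty_like(v)
    for i in range(n_nodes):
        idx = rng.choice(m_local, batch, replace=False)
        # Hybrid estimator: SARAH-type recursive correction mixed with a plain
        # stochastic gradient at the new iterate (mixing weight beta).
        g_new = stoch_grad(i, x_new[i], idx)
        g_old = stoch_grad(i, x[i], idx)
        v_new[i] = g_new + (1.0 - beta) * (v[i] - g_old)
    y = W @ y + v_new - v                          # gradient tracking update
    x, v = x_new, v_new

# Stationarity proxy: norm of the network-averaged gradient tracker.
print("||avg gradient tracker|| =", np.linalg.norm(y.mean(axis=0)))
```

The single-loop structure is what keeps the method implementation-friendly: every iteration uses only one round of neighbor communication and a small mini-batch per node, with no periodic full-gradient or restart phase.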
