A Sharp Estimate on the Transient Time of Distributed Stochastic Gradient Descent

6 June 2019
Shi Pu, Alexander Olshevsky, I. Paschalidis

Papers citing "A Sharp Estimate on the Transient Time of Distributed Stochastic Gradient Descent"

6 papers shown

1. A Unified Theory of Decentralized SGD with Changing Topology and Local Updates (23 Mar 2020)
   Anastasia Koloskova, Nicolas Loizou, Sadra Boreiri, Martin Jaggi, Sebastian U. Stich [FedML]

2. Decentralized gradient methods: does topology matter? (28 Feb 2020)
   Giovanni Neglia, Chuan Xu, Don Towsley, G. Calbi

3. Gradient tracking and variance reduction for decentralized optimization and machine learning (13 Feb 2020)
   Ran Xin, S. Kar, U. Khan

4. Robust Distributed Accelerated Stochastic Gradient Methods for Multi-Agent Networks (19 Oct 2019)
   Alireza Fallah, Mert Gurbuzbalaban, Asuman Ozdaglar, Umut Simsekli, Lingjiong Zhu

5. Asymptotic Network Independence in Distributed Stochastic Optimization for Machine Learning (28 Jun 2019)
   Shi Pu, Alexander Olshevsky, I. Paschalidis

6. Swarming for Faster Convergence in Stochastic Optimization (11 Jun 2018)
   Shi Pu, Alfredo García