arXiv: 1906.02702
A Sharp Estimate on the Transient Time of Distributed Stochastic Gradient Descent
6 June 2019
Shi Pu, Alexander Olshevsky, I. Paschalidis
Papers citing "A Sharp Estimate on the Transient Time of Distributed Stochastic Gradient Descent" (6 papers):
A Unified Theory of Decentralized SGD with Changing Topology and Local Updates
Anastasia Koloskova, Nicolas Loizou, Sadra Boreiri, Martin Jaggi, Sebastian U. Stich
23 Mar 2020
Decentralized gradient methods: does topology matter?
Giovanni Neglia, Chuan Xu, Don Towsley, G. Calbi
28 Feb 2020
Gradient tracking and variance reduction for decentralized optimization and machine learning
Ran Xin, S. Kar, U. Khan
13 Feb 2020
Robust Distributed Accelerated Stochastic Gradient Methods for Multi-Agent Networks
Alireza Fallah, Mert Gurbuzbalaban, Asuman Ozdaglar, Umut Simsekli, Lingjiong Zhu
19 Oct 2019
Asymptotic Network Independence in Distributed Stochastic Optimization for Machine Learning
Shi Pu, Alexander Olshevsky, I. Paschalidis
28 Jun 2019
Swarming for Faster Convergence in Stochastic Optimization
Shi Pu, Alfredo García
11 Jun 2018