arXiv:1508.05003
AdaDelay: Delay Adaptive Distributed Stochastic Convex Optimization
20 August 2015
S. Sra, Adams Wei Yu, Mu Li, Alex Smola
Papers citing "AdaDelay: Delay Adaptive Distributed Stochastic Convex Optimization" (8 papers)
Consistent Lock-free Parallel Stochastic Gradient Descent for Fast and Stable Convergence
Karl Bäckström, Ivan Walulya, Marina Papatriantafilou, P. Tsigas (17 Feb 2021)
Stochastic bandits with arm-dependent delays
Anne Gael Manegueu, Claire Vernade, Alexandra Carpentier, Michal Valko (18 Jun 2020)
Bandwidth Reduction using Importance Weighted Pruning on Ring AllReduce
Zehua Cheng, Zhenghua Xu (06 Jan 2019)
POLO: a POLicy-based Optimization library
Arda Aytekin, Martin Biel, M. Johansson (08 Oct 2018)
A Tight Convergence Analysis for Stochastic Gradient Descent with Delayed Updates
Yossi Arjevani, Ohad Shamir, Nathan Srebro (26 Jun 2018)
Taming Convergence for Asynchronous Stochastic Gradient Descent with Unbounded Delay in Non-Convex Learning
Xin Zhang, Jia-Wei Liu, Zhengyuan Zhu (24 May 2018)
Asynchronous Stochastic Gradient Descent with Delay Compensation
Shuxin Zheng, Qi Meng, Taifeng Wang, Wei Chen, Nenghai Yu, Zhiming Ma, Tie-Yan Liu (27 Sep 2016)
On Unbounded Delays in Asynchronous Parallel Fixed-Point Algorithms
Robert Hannah, W. Yin (15 Sep 2016)