Distributed Stochastic Variance Reduced Gradient Methods and A Lower Bound for Communication Complexity
J. Lee, Qihang Lin, Tengyu Ma, Tianbao Yang
arXiv:1507.07595, 27 July 2015
Papers citing "Distributed Stochastic Variance Reduced Gradient Methods and A Lower Bound for Communication Complexity" (6 of 6 papers shown):
Optimising cost vs accuracy of decentralised analytics in fog computing environments
Lorenzo Valerio, A. Passarella, M. Conti. 09 Dec 2020.

Federated Optimization: Distributed Machine Learning for On-Device Intelligence
Jakub Konecný, H. B. McMahan, Daniel Ramage, Peter Richtárik. 08 Oct 2016.

Less than a Single Pass: Stochastically Controlled Stochastic Gradient Method
Lihua Lei, Michael I. Jordan. 12 Sep 2016.

Trading-off variance and complexity in stochastic gradient descent
Vatsal Shah, Megasthenis Asteris, Anastasios Kyrillidis, Sujay Sanghavi. 22 Mar 2016.

Communication Complexity of Distributed Convex Learning and Optimization
Yossi Arjevani, Ohad Shamir. 05 Jun 2015.

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang. 19 Mar 2014.