Distributed Stochastic Variance Reduced Gradient Methods and A Lower Bound for Communication Complexity

27 July 2015
J. Lee, Qihang Lin, Tengyu Ma, Tianbao Yang
FedML

Papers citing "Distributed Stochastic Variance Reduced Gradient Methods and A Lower Bound for Communication Complexity"

6 papers shown
Optimising cost vs accuracy of decentralised analytics in fog computing environments
Lorenzo Valerio, A. Passarella, M. Conti
09 Dec 2020
Federated Optimization: Distributed Machine Learning for On-Device Intelligence
Jakub Konecný, H. B. McMahan, Daniel Ramage, Peter Richtárik
FedML
08 Oct 2016
Less than a Single Pass: Stochastically Controlled Stochastic Gradient Method
Lihua Lei, Michael I. Jordan
12 Sep 2016
Trading-off variance and complexity in stochastic gradient descent
Vatsal Shah, Megasthenis Asteris, Anastasios Kyrillidis, Sujay Sanghavi
22 Mar 2016
Communication Complexity of Distributed Convex Learning and Optimization
Yossi Arjevani, Ohad Shamir
05 Jun 2015
A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang
ODL
19 Mar 2014