Optimal algorithms for smooth and strongly convex distributed optimization in networks

28 February 2017
Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Y. Lee, Laurent Massoulié
arXiv:1702.08704

Papers citing "Optimal algorithms for smooth and strongly convex distributed optimization in networks"

8 papers shown

Ringmaster ASGD: The First Asynchronous SGD with Optimal Time Complexity
Artavazd Maranjyan, Alexander Tyurin, Peter Richtárik · 27 Jan 2025

Dimension-Free Iteration Complexity of Finite Sum Optimization Problems
Yossi Arjevani, Ohad Shamir · 30 Jun 2016

A Distributed Newton Method for Large Scale Consensus Optimization
Rasul Tutunov, Haitham Bou-Ammar, Ali Jadbabaie · 21 Jun 2016

On the Iteration Complexity of Oblivious First-Order Optimization Algorithms
Yossi Arjevani, Ohad Shamir · 11 May 2016

A Decentralized Second-Order Method with Exact Linear Convergence Rate for Consensus Optimization
Aryan Mokhtari, Wei Shi, Qing Ling, Alejandro Ribeiro · 01 Feb 2016

Communication Complexity of Distributed Convex Learning and Optimization
Yossi Arjevani, Ohad Shamir · 05 Jun 2015

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien · 01 Jul 2014

Fundamental Limits of Online and Distributed Algorithms for Statistical Learning and Estimation
Ohad Shamir · 14 Nov 2013