ResearchTrend.AI
Avoiding communication in primal and dual block coordinate descent methods

Aditya Devarakonda, Kimon Fountoulakis, J. Demmel, Michael W. Mahoney
13 December 2016

Papers citing "Avoiding communication in primal and dual block coordinate descent methods"

6 papers shown:
  • Distributed Mini-Batch SDCA. Martin Takáč, Peter Richtárik, Nathan Srebro. 29 Jul 2015.
  • SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization. Zheng Qu, Peter Richtárik, Martin Takáč, Olivier Fercoq. 08 Feb 2015.
  • Communication-Efficient Distributed Dual Coordinate Ascent. Martin Jaggi, Virginia Smith, Martin Takáč, Jonathan Terhorst, S. Krishnan, Thomas Hofmann, Michael I. Jordan. 04 Sep 2014.
  • Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization. Shai Shalev-Shwartz, Tong Zhang. 10 Sep 2012.
  • Iteration Complexity of Randomized Block-Coordinate Descent Methods for Minimizing a Composite Function. Peter Richtárik, Martin Takáč. 14 Jul 2011.
  • HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent. Feng Niu, Benjamin Recht, Christopher Ré, Stephen J. Wright. 28 Jun 2011.