
Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation

Zheng Qu, Peter Richtárik
arXiv:1412.8063, 27 December 2014

Papers citing "Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation" (9 papers)
Laplacian-based Semi-Supervised Learning in Multilayer Hypergraphs by Coordinate Descent
Sara Venturini, Andrea Cristofari, Francesco Rinaldi, Francesco Tudisco
28 Jan 2023

GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
A. Maranjyan, M. Safaryan, Peter Richtárik
28 Oct 2022

Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
Filip Hanzely
26 Aug 2020

Precise expressions for random projections: Low-rank approximation and randomized Newton
Michal Derezinski, Feynman T. Liang, Zhenyu A. Liao, Michael W. Mahoney
18 Jun 2020

Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling
Mojmír Mutný, Michal Derezinski, Andreas Krause
25 Oct 2019

99% of Distributed Optimization is a Waste of Time: The Issue and How to Fix it
Konstantin Mishchenko, Filip Hanzely, Peter Richtárik
27 Jan 2019

SEGA: Variance Reduction via Gradient Sketching
Filip Hanzely, Konstantin Mishchenko, Peter Richtárik
09 Sep 2018

Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods
Nicolas Loizou, Peter Richtárik
27 Dec 2017

Stochastic Dual Coordinate Ascent with Adaptive Probabilities
Dominik Csiba, Zheng Qu, Peter Richtárik
27 Feb 2015