SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization
Zheng Qu, Peter Richtárik, Martin Takáč, Olivier Fercoq
8 February 2015 · arXiv:1502.02268

Papers citing "SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization"

28 papers shown

Joker: Joint Optimization Framework for Lightweight Kernel Machines
Junhong Zhang, Zhihui Lai
23 May 2025

Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling
Mojmír Mutný, Michal Derezinski, Andreas Krause
25 Oct 2019

Primal Method for ERM with Flexible Mini-batching Schemes and Non-convex Losses
Dominik Csiba, Peter Richtárik
07 Jun 2015

Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting
Jakub Konecný, Jie Liu, Peter Richtárik, Martin Takáč
16 Apr 2015

Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation
Zheng Qu, Peter Richtárik
27 Dec 2014

Coordinate Descent with Arbitrary Sampling I: Algorithms and Complexity
Zheng Qu, Peter Richtárik
27 Dec 2014

Randomized Dual Coordinate Ascent with Arbitrary Sampling
Zheng Qu, Peter Richtárik, Tong Zhang
21 Nov 2014

Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares
Mert Pilanci, Martin J. Wainwright
03 Nov 2014

mS2GD: Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting
Jakub Konecný, Jie Liu, Peter Richtárik, Martin Takáč
17 Oct 2014

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien
01 Jul 2014

Fast Distributed Coordinate Descent for Non-Strongly Convex Losses
Olivier Fercoq, Zheng Qu, Peter Richtárik, Martin Takáč
21 May 2014

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang
19 Mar 2014

Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
Julien Mairal
18 Feb 2014

A Stochastic Quasi-Newton Method for Large-Scale Optimization
R. Byrd, Samantha Hansen, J. Nocedal, Y. Singer
27 Jan 2014

Accelerated, Parallel and Proximal Coordinate Descent
Olivier Fercoq, Peter Richtárik
20 Dec 2013

Semi-Stochastic Gradient Descent Methods
Jakub Konecný, Peter Richtárik
05 Dec 2013

Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods
Jascha Narain Sohl-Dickstein, Ben Poole, Surya Ganguli
09 Nov 2013

On Optimal Probabilities in Stochastic Coordinate Descent Methods
Peter Richtárik, Martin Takáč
13 Oct 2013

Distributed Coordinate Descent Method for Learning with Big Data
Peter Richtárik, Martin Takáč
08 Oct 2013

Minimizing Finite Sums with the Stochastic Average Gradient
Mark Schmidt, Nicolas Le Roux, Francis R. Bach
10 Sep 2013

Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
Shai Shalev-Shwartz, Tong Zhang
10 Sep 2013

Separable Approximations and Decomposition Methods for the Augmented Lagrangian
R. Tappenden, Peter Richtárik, Burak Büke
30 Aug 2013

Accelerated Mini-Batch Stochastic Dual Coordinate Ascent
Shai Shalev-Shwartz, Tong Zhang
12 May 2013

Inexact Coordinate Descent: Complexity and Preconditioning
R. Tappenden, Peter Richtárik, J. Gondzio
19 Apr 2013

Mini-Batch Primal and Dual Methods for SVMs
Martin Takáč, A. Bijral, Peter Richtárik, Nathan Srebro
10 Mar 2013

Parallel Coordinate Descent Methods for Big Data Optimization
Peter Richtárik, Martin Takáč
04 Dec 2012

Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Shai Shalev-Shwartz, Tong Zhang
10 Sep 2012

Iteration Complexity of Randomized Block-Coordinate Descent Methods for Minimizing a Composite Function
Peter Richtárik, Martin Takáč
14 Jul 2011