ResearchTrend.AI

SDCA without Duality
arXiv:1502.06177
22 February 2015
Shai Shalev-Shwartz
Papers citing "SDCA without Duality"

7 papers:

1. Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport
   Hiroyuki Sato, Hiroyuki Kasai, Bamdev Mishra (18 Feb 2017)
2. Importance Sampling for Minibatches
   Dominik Csiba, Peter Richtárik (06 Feb 2016)
3. Finito: A Faster, Permutable Incremental Gradient Method for Big Data Problems
   Aaron Defazio, T. Caetano, Justin Domke (10 Jul 2014)
4. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
   Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)
5. Semi-Stochastic Gradient Descent Methods
   Jakub Konecný, Peter Richtárik (05 Dec 2013)
6. Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
   Shai Shalev-Shwartz, Tong Zhang (10 Sep 2013)
7. Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
   Shai Shalev-Shwartz, Tong Zhang (10 Sep 2012)