arXiv:1706.07001
Cited By
Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations
Jialei Wang, Tong Zhang
21 June 2017
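For context on the algorithm family the title refers to, a minimal sketch of a minibatch variance-reduced proximal iteration, in the style of the prox-SVRG method of Xiao and Zhang cited below, applied to an l1-regularized least-squares objective, might look like the following. The function names, step size, and problem setup are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t * ||.||_1 (illustrative choice of regularizer)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def minibatch_prox_svrg(A, b, lam, eta=0.1, epochs=50, batch=2, seed=0):
    """Sketch of minibatch prox-SVRG for (1/2n)||Ax - b||^2 + lam * ||x||_1.

    Hypothetical illustration of the generic scheme, not the paper's exact method.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_tilde = np.zeros(d)                           # snapshot point
    for _ in range(epochs):
        mu = A.T @ (A @ x_tilde - b) / n            # full gradient at the snapshot
        x = x_tilde.copy()
        for _ in range(n // batch):
            idx = rng.choice(n, size=batch, replace=False)
            Ai, bi = A[idx], b[idx]
            # Variance-reduced minibatch gradient estimate
            v = (Ai.T @ (Ai @ x - bi) / batch
                 - Ai.T @ (Ai @ x_tilde - bi) / batch
                 + mu)
            x = prox_l1(x - eta * v, eta * lam)     # proximal (soft-threshold) step
        x_tilde = x                                 # update snapshot
    return x_tilde
```

The snapshot full gradient keeps the minibatch estimate unbiased while shrinking its variance as the iterates approach the optimum, which is what allows larger step sizes than plain proximal SGD.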
Papers citing "Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations" (9 papers):
1. Oracle Complexity of Second-Order Methods for Finite-Sum Problems. Yossi Arjevani, Ohad Shamir. 15 Nov 2016.
2. Sub-sampled Newton Methods with Non-uniform Sampling. Peng Xu, Jiyan Yang, Farbod Roosta-Khorasani, Christopher Ré, Michael W. Mahoney. 02 Jul 2016.
3. Second-Order Stochastic Optimization for Machine Learning in Linear Time. Naman Agarwal, Brian Bullins, Elad Hazan. 12 Feb 2016.
4. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives. Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien. 01 Jul 2014.
5. A Proximal Stochastic Gradient Method with Progressive Variance Reduction. Lin Xiao, Tong Zhang. 19 Mar 2014.
6. A Stochastic Quasi-Newton Method for Large-Scale Optimization. R. Byrd, Samantha Hansen, J. Nocedal, Y. Singer. 27 Jan 2014.
7. Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization. Shai Shalev-Shwartz, Tong Zhang. 10 Sep 2012.
8. Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization. Mark Schmidt, Nicolas Le Roux, Francis R. Bach. 12 Sep 2011.
9. Better Mini-Batch Algorithms via Accelerated Gradient Methods. Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan. 22 Jun 2011.