
Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations
Jialei Wang, Tong Zhang
arXiv:1706.07001, 21 June 2017

Papers citing "Improved Optimization of Finite Sums with Minibatch Stochastic Variance Reduced Proximal Iterations"

9 / 9 papers shown
Oracle Complexity of Second-Order Methods for Finite-Sum Problems
Yossi Arjevani, Ohad Shamir (15 Nov 2016)

Sub-sampled Newton Methods with Non-uniform Sampling
Peng Xu, Jiyan Yang, Farbod Roosta-Khorasani, Christopher Ré, Michael W. Mahoney (02 Jul 2016)

Second-Order Stochastic Optimization for Machine Learning in Linear Time
Naman Agarwal, Brian Bullins, Elad Hazan (12 Feb 2016)

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang (19 Mar 2014)

A Stochastic Quasi-Newton Method for Large-Scale Optimization
R. Byrd, Samantha Hansen, J. Nocedal, Y. Singer (27 Jan 2014)

Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Shai Shalev-Shwartz, Tong Zhang (10 Sep 2012)

Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization
Mark Schmidt, Nicolas Le Roux, Francis R. Bach (12 Sep 2011)

Better Mini-Batch Algorithms via Accelerated Gradient Methods
Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan (22 Jun 2011)