ResearchTrend.AI
Boosting First-Order Methods by Shifting Objective: New Schemes with Faster Worst-Case Rates

25 May 2020
Kaiwen Zhou, Anthony Man-Cho So, James Cheng

Papers citing "Boosting First-Order Methods by Shifting Objective: New Schemes with Faster Worst-Case Rates"

4 papers
SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien
ODL · 110 · 1,817 · 0 · 01 Jul 2014
A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang
ODL · 140 · 738 · 0 · 19 Mar 2014
Minimizing Finite Sums with the Stochastic Average Gradient
Mark Schmidt, Nicolas Le Roux, Francis R. Bach
257 · 1,246 · 0 · 10 Sep 2013
Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Shai Shalev-Shwartz, Tong Zhang
117 · 1,031 · 0 · 10 Sep 2012