Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

25 January 2019
A. Kulunchakov, Julien Mairal

Papers citing "Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise"

5 / 5 papers shown

Demystifying SGD with Doubly Stochastic Gradients
Kyurae Kim, Joohwan Ko, Yian Ma, Jacob R. Gardner
03 Jun 2024

Tight and Robust Private Mean Estimation with Few Users
Cheng-Han Chiang, Vahab Mirrokni, Hung-yi Lee
22 Oct 2021

Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization
Chaobing Song, Yong Jiang, Yi Ma
18 Jun 2020

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang
19 Mar 2014

Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
Julien Mairal
18 Feb 2014