ResearchTrend.AI

Accelerating Mini-batch SARAH by Step Size Rules

arXiv: 1906.08496
20 June 2019
Zhuang Yang, Zengping Chen, Cheng-Yu Wang

Papers citing "Accelerating Mini-batch SARAH by Step Size Rules"

4 papers shown:

  • Sarah Frank-Wolfe: Methods for Constrained Optimization with Best Rates and Practical Features. Aleksandr Beznosikov, David Dobre, Gauthier Gidel. 23 Apr 2023.
  • SARAH-based Variance-reduced Algorithm for Stochastic Finite-sum Cocoercive Variational Inequalities. Aleksandr Beznosikov, Alexander Gasnikov. 12 Oct 2022.
  • Random-reshuffled SARAH does not need a full gradient computations. Aleksandr Beznosikov, Martin Takáč. 26 Nov 2021.
  • A Proximal Stochastic Gradient Method with Progressive Variance Reduction. Lin Xiao, Tong Zhang. 19 Mar 2014.