Random Reshuffling with Variance Reduction: New Analysis and Better Rates
Grigory Malinovsky, Alibek Sailanbayev, Peter Richtárik
arXiv:2104.09342, 19 April 2021
Papers citing "Random Reshuffling with Variance Reduction: New Analysis and Better Rates" (7 of 7 papers shown)
Random Reshuffling: Simple Analysis with Vast Improvements
Konstantin Mishchenko, Ahmed Khaled, Peter Richtárik
10 Jun 2020

A Unified Convergence Analysis for Shuffling-Type Gradient Methods
Lam M. Nguyen, Quoc Tran-Dinh, Dzung Phan, Phuong Ha Nguyen, Marten van Dijk
19 Feb 2020

SGD without Replacement: Sharper Rates for General Smooth Convex Functions
Prateek Jain, Dheeraj M. Nagaraj, Praneeth Netrapalli
04 Mar 2019

Don't Jump Through Hoops and Remove Those Loops: SVRG and Katyusha are Better Without the Outer Loop
D. Kovalev, Samuel Horváth, Peter Richtárik
24 Jan 2019

Stochastic Learning under Random Reshuffling with Constant Step-sizes
Bicheng Ying, Kun Yuan, Stefan Vlaski, Ali H. Sayed
21 Mar 2018

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien
01 Jul 2014

Practical recommendations for gradient-based training of deep architectures
Yoshua Bengio
24 Jun 2012