The Complexity of Finding Stationary Points with Stochastic Gradient Descent
Yoel Drori, Ohad Shamir
arXiv:1910.01845, 4 October 2019
Papers citing "The Complexity of Finding Stationary Points with Stochastic Gradient Descent" (11 of 11 shown):
Daniil Medyakov, Gleb Molodtsov, S. Chezhegov, Alexey Rebrikov, Aleksandr Beznosikov. "Variance Reduction Methods Do Not Need to Compute Full Gradients: Improved Efficiency through Shuffling". 21 Feb 2025.
Chen Jia. "Generalizing Reward Modeling for Out-of-Distribution Preference Learning". 22 Feb 2024.
Yao Shu, Jiongfeng Fang, Y. He, Fei Richard Yu. "OptEx: Expediting First-Order Optimization with Approximately Parallelized Iterations". 18 Feb 2024.
Jongmin Lee, Ernest K. Ryu. "Accelerating Value Iteration with Anchoring". 26 May 2023.
Kwangjun Ahn, Ali Jadbabaie, S. Sra. "How to escape sharp minima with random perturbations". 25 May 2023.
Junchi Yang, Xiang Li, Ilyas Fatkhullin, Niao He. "Two Sides of One Coin: the Limits of Untuned SGD and the Power of Adaptive Methods". 21 May 2023.
Lindon Roberts, E. Smyth. "A simplified convergence theory for Byzantine resilient stochastic gradient descent". 25 Aug 2022.
Xia Jiang, Xianlin Zeng, Jian Sun, Jie Chen, Lihua Xie. "Distributed stochastic proximal algorithm with random reshuffling for non-smooth finite-sum optimization". 06 Nov 2021.
Konstantin Mishchenko, Ahmed Khaled, Peter Richtárik. "Random Reshuffling: Simple Analysis with Vast Improvements". 10 Jun 2020.
Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis. "Learning Halfspaces with Massart Noise Under Structured Distributions". 13 Feb 2020.
Ahmed Khaled, Peter Richtárik. "Better Theory for SGD in the Nonconvex World". 09 Feb 2020.