SSRGD: Simple Stochastic Recursive Gradient Descent for Escaping Saddle Points
Zhize Li
arXiv:1904.09265, 19 April 2019

Papers citing "SSRGD: Simple Stochastic Recursive Gradient Descent for Escaping Saddle Points" (10 of 10 papers shown)
1. Second-Order Convergence in Private Stochastic Non-Convex Optimization
   Youming Tao, Zuyuan Zhang, Dongxiao Yu, Xiuzhen Cheng, Falko Dressler, Di Wang (21 May 2025)

2. Probabilistic Guarantees of Stochastic Recursive Gradient in Non-Convex Finite Sum Problems
   Yanjie Zhong, Jiaqi Li, Soumendra Lahiri (29 Jan 2024)

3. Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
   Le‐Yu Chen, Jing Xu, Luo Luo (16 Jan 2023)

4. Tackling Benign Nonconvexity with Smoothing and Stochastic Gradients
   Harsh Vardhan, Sebastian U. Stich (18 Feb 2022)

5. Escape Saddle Points by a Simple Gradient-Descent Based Algorithm
   Chenyi Zhang, Tongyang Li (28 Nov 2021)

6. Faster Perturbed Stochastic Gradient Methods for Finding Local Minima
   Zixiang Chen, Dongruo Zhou, Quanquan Gu (25 Oct 2021)

7. ANITA: An Optimal Loopless Accelerated Variance-Reduced Gradient Method
   Zhize Li (21 Mar 2021)

8. Stochastic Gradient Langevin Dynamics with Variance Reduction
   Zhishen Huang, Stephen Becker (12 Feb 2021)

9. PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization
   Zhize Li, Hongyan Bao, Xiangliang Zhang, Peter Richtárik (25 Aug 2020)

10. A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization
    Quoc Tran-Dinh, Nhan H. Pham, Dzung T. Phan, Lam M. Nguyen (08 Jul 2019)