Stochastic gradient descent with noise of machine learning type. Part I: Discrete time analysis
Stephan Wojtowytsch · arXiv:2105.01650 · 4 May 2021
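For reference only: below is a minimal Python sketch of vanilla minibatch SGD in which the gradient noise comes solely from subsampling a finite training set, the setting that motivates the paper's title. The finite-sum least-squares objective, the helper minibatch_sgd, and all parameter values are hypothetical illustrations, not code or settings from the paper.

import numpy as np

def minibatch_sgd(grad_i, x0, n, steps, lr, batch_size, seed=None):
    """Plain discrete-time minibatch SGD on f(x) = (1/n) * sum_i f_i(x).

    grad_i(i, x) must return the gradient of the i-th sample loss f_i at x.
    The only randomness is the subsampling of indices, so the gradient
    noise vanishes wherever every per-sample gradient does.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        batch = rng.choice(n, size=batch_size, replace=False)
        g = np.mean([grad_i(i, x) for i in batch], axis=0)  # unbiased estimate of grad f(x)
        x = x - lr * g                                      # x_{k+1} = x_k - eta * g_k
    return x

# Hypothetical least-squares example: f_i(x) = 0.5 * (a_i @ x - b_i) ** 2.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
b = rng.normal(size=100)
grad_i = lambda i, x: (A[i] @ x - b[i]) * A[i]
x_hat = minibatch_sgd(grad_i, np.zeros(5), n=100, steps=2000, lr=0.05, batch_size=10)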
Papers citing "Stochastic gradient descent with noise of machine learning type. Part I: Discrete time analysis" (26 of 26 papers shown)
Mask in the Mirror: Implicit Sparsification · Tom Jacobs, R. Burkholz · 19 Aug 2024
Almost sure convergence rates of stochastic gradient methods under gradient domination · Simon Weissmann, Sara Klein, Waïss Azizian, Leif Döring · 22 May 2024
Stochastic gradient descent with noise of machine learning type. Part II: Continuous time analysis · Stephan Wojtowytsch · 4 Jun 2021
Convergence of stochastic gradient descent schemes for Łojasiewicz-landscapes · Steffen Dereich, Sebastian Kassing · 16 Feb 2021
On the Almost Sure Convergence of Stochastic Gradient Descent in Non-Convex Problems · P. Mertikopoulos, Nadav Hallak, Ali Kavis, Volkan Cevher · 19 Jun 2020
Stopping Criteria for, and Strong Convergence of, Stochastic Gradient Descent on Bottou-Curtis-Nocedal Functions · V. Patel · 1 Apr 2020
Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling · Yu-Guan Hsieh, F. Iutzeler, J. Malick, P. Mertikopoulos · 23 Mar 2020
A Simple Convergence Proof of Adam and Adagrad · Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier · 5 Mar 2020
Loss Landscape Sightseeing with Multi-Point Optimization · Ivan Skorokhodov, Andrey Kravchenko · 9 Oct 2019
Linear Convergence of Adaptive Stochastic Gradient Descent · Yuege Xie, Xiaoxia Wu, Rachel A. Ward · 28 Aug 2019
On the convergence of single-call stochastic extra-gradient methods · Yu-Guan Hsieh, F. Iutzeler, J. Malick, P. Mertikopoulos · 22 Aug 2019
Unified Optimal Analysis of the (Stochastic) Gradient Method · Sebastian U. Stich · 9 Jul 2019
Convergence rates for the stochastic gradient descent method for non-convex objective functions · Benjamin J. Fehrman, Benjamin Gess, Arnulf Jentzen · 2 Apr 2019
On exponential convergence of SGD in non-convex over-parametrized learning · Xinhai Liu, M. Belkin, Yu-Shen Liu · 6 Nov 2018
Fast and Faster Convergence of SGD for Over-Parameterized Models and an Accelerated Perceptron · Sharan Vaswani, Francis R. Bach, Mark Schmidt · 16 Oct 2018
AdaGrad stepsizes: Sharp convergence over nonconvex landscapes · Rachel A. Ward, Xiaoxia Wu, Léon Bottou · 5 Jun 2018
The loss landscape of overparameterized neural networks · Y. Cooper · 26 Apr 2018
Natasha 2: Faster Non-Convex Optimization Than SGD · Zeyuan Allen-Zhu · 29 Aug 2017
Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains · Aymeric Dieuleveut, Alain Durmus, Francis R. Bach · 20 Jul 2017
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition · Hamed Karimi, J. Nutini, Mark Schmidt · 16 Aug 2016
Optimization Methods for Large-Scale Machine Learning · Léon Bottou, Frank E. Curtis, J. Nocedal · 15 Jun 2016
Stochastic modified equations and adaptive stochastic gradient algorithms · Qianxiao Li, Cheng Tai, Weinan E · 19 Nov 2015
Stochastic Gradient Descent, Weighted Sampling, and the Randomized Kaczmarz algorithm · Deanna Needell, Nathan Srebro, Rachel A. Ward · 21 Oct 2013
Stochastic First- and Zeroth-order Methods for Nonconvex Stochastic Programming · Saeed Ghadimi, Guanghui Lan · 22 Sep 2013
Non-strongly-convex smooth stochastic approximation with convergence rate O(1/n) · Francis R. Bach, Eric Moulines · 10 Jun 2013
Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization · Alexander Rakhlin, Ohad Shamir, Karthik Sridharan · 26 Sep 2011