ResearchTrend.AI
Tight Analyses for Non-Smooth Stochastic Gradient Descent
arXiv: 1812.05217 · 13 December 2018
Nicholas J. A. Harvey, Christopher Liaw, Y. Plan, Sikander Randhawa

Papers citing "Tight Analyses for Non-Smooth Stochastic Gradient Descent"

11 papers shown
Sketched Adaptive Federated Deep Learning: A Sharp Convergence Analysis
Zhijie Chen, Qiaobo Li, A. Banerjee
FedML · 11 Nov 2024

Nonlinear Stochastic Gradient Descent and Heavy-tailed Noise: A Unified Framework and High-probability Guarantees
Aleksandar Armacki, Shuhua Yu, Pranay Sharma, Gauri Joshi, Dragana Bajović, D. Jakovetić, S. Kar
17 Oct 2024

Linear Combination of Saved Checkpoints Makes Consistency and Diffusion Models Better
En-hao Liu, Junyi Zhu, Zinan Lin, Xuefei Ning, Shuaiqi Wang, ..., Sergey Yekhanin, Guohao Dai, Huazhong Yang, Yu Wang
MoMe · 02 Apr 2024

Weighted Averaged Stochastic Gradient Descent: Asymptotic Normality and Optimality
Ziyang Wei, Wanrong Zhu, Wei Biao Wu
13 Jul 2023

Introduction to Online Convex Optimization
Elad Hazan
OffRL · 07 Sep 2019

Privacy Amplification by Iteration
Vitaly Feldman, Ilya Mironov, Kunal Talwar, Abhradeep Thakurta
FedML · 20 Aug 2018

Potential-Function Proofs for First-Order Methods
N. Bansal, Anupam Gupta
13 Dec 2017

Minimizing Finite Sums with the Stochastic Average Gradient
Mark Schmidt, Nicolas Le Roux, Francis R. Bach
10 Sep 2013

A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method
Simon Lacoste-Julien, Mark Schmidt, Francis R. Bach
10 Dec 2012

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012

Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization
Alexander Rakhlin, Ohad Shamir, Karthik Sridharan
26 Sep 2011