ResearchTrend.AI


Accelerated SGD for Non-Strongly-Convex Least Squares


3 March 2022
Aditya Varre
Nicolas Flammarion
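For context, the paper studies stochastic gradient descent with acceleration on least-squares problems. The sketch below shows plain heavy-ball (momentum) SGD on a least-squares objective; the constant step size and momentum value are illustrative defaults, not the schedule analyzed in the paper.

```python
import numpy as np

def momentum_sgd_least_squares(X, y, lr=0.01, momentum=0.9, epochs=50, seed=0):
    """Heavy-ball SGD on f(w) = (1/2n) * ||X w - y||^2.

    Illustrative sketch only: hyperparameters are generic defaults,
    not the accelerated schedule from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    v = np.zeros(d)  # momentum buffer
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]  # single-sample gradient
            v = momentum * v + grad
            w = w - lr * v
    return w

# Usage: recover the weights of a noiseless linear model.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true
w_hat = momentum_sgd_least_squares(X, y)
```

In this noiseless (interpolating) setting, constant-step momentum SGD drives `w_hat` close to `w_true`; with noisy labels, a decaying step size or iterate averaging would be needed.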

Papers citing "Accelerated SGD for Non-Strongly-Convex Least Squares"

6 papers shown
Corner Gradient Descent
Dmitry Yarotsky
16 Apr 2025

Increasing Batch Size Improves Convergence of Stochastic Gradient Descent with Momentum
Keisuke Kamo, Hideaki Iiduka
15 Jan 2025

SGD with memory: fundamental properties and stochastic acceleration
Dmitry Yarotsky, Maksim Velikanov
05 Oct 2024

The Optimality of (Accelerated) SGD for High-Dimensional Quadratic Optimization
Haihan Zhang, Yuanshi Liu, Qianwen Chen, Cong Fang
15 Sep 2024

A view of mini-batch SGD via generating functions: conditions of convergence, phase transitions, benefit from negative momenta
Maksim Velikanov, Denis Kuznedelev, Dmitry Yarotsky
22 Jun 2022

Non-asymptotic oracle inequalities for the Lasso in high-dimensional mixture of experts
TrungTin Nguyen, Hien Nguyen, Faicel Chamroukhi, Geoffrey J. McLachlan
22 Sep 2020