Shuffling Momentum Gradient Algorithm for Convex Optimization
arXiv:2403.03180, 5 March 2024
Trang H. Tran, Quoc Tran-Dinh, Lam M. Nguyen

Papers citing "Shuffling Momentum Gradient Algorithm for Convex Optimization"

11 / 11 papers shown
On the Convergence of mSGD and AdaGrad for Stochastic Optimization
Ruinan Jin, Yu Xing, Xingkang He (26 Jan 2022)

SMG: A Shuffling Gradient-Based Method with Momentum
Trang H. Tran, Lam M. Nguyen, Quoc Tran-Dinh (24 Nov 2020)

SGD with shuffling: optimal rates without component convexity and large epoch requirements
Kwangjun Ahn, Chulhee Yun, S. Sra (12 Jun 2020)

Random Reshuffling: Simple Analysis with Vast Improvements
Konstantin Mishchenko, Ahmed Khaled, Peter Richtárik (10 Jun 2020)

A Unified Convergence Analysis for Shuffling-Type Gradient Methods
Lam M. Nguyen, Quoc Tran-Dinh, Dzung Phan, Phuong Ha Nguyen, Marten van Dijk (19 Feb 2020)

SGD without Replacement: Sharper Rates for General Smooth Convex Functions
Prateek Jain, Dheeraj M. Nagaraj, Praneeth Netrapalli (04 Mar 2019)

Random Shuffling Beats SGD after Finite Epochs
Jeff Z. HaoChen, S. Sra (26 Jun 2018)

SGDR: Stochastic Gradient Descent with Warm Restarts
I. Loshchilov, Frank Hutter (13 Aug 2016)

On the Influence of Momentum Acceleration on Online Learning
Kun Yuan, Bicheng Ying, Ali H. Sayed (14 Mar 2016)

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)

Better Mini-Batch Algorithms via Accelerated Gradient Methods
Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan (22 Jun 2011)