ResearchTrend.AI
Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity

3 June 2019
Chaobing Song
Yong Jiang
Yi Ma

Papers citing "Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity"

6 papers shown
  • RECAPP: Crafting a More Efficient Catalyst for Convex Optimization
    Y. Carmon, A. Jambulapati, Yujia Jin, Aaron Sidford (17 Jun 2022)
  • Distributionally Robust Optimization via Ball Oracle Acceleration
    Y. Carmon, Danielle Hausler (24 Mar 2022)
  • Stochastic Bias-Reduced Gradient Methods
    Hilal Asi, Y. Carmon, A. Jambulapati, Yujia Jin, Aaron Sidford (17 Jun 2021)
  • Cyclic Coordinate Dual Averaging with Extrapolation
    Chaobing Song, Jelena Diakonikolas (26 Feb 2021)
  • Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization
    Chaobing Song, Yong Jiang, Yi Ma (18 Jun 2020)
  • A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
    Weijie Su, Stephen P. Boyd, Emmanuel J. Candes (04 Mar 2015)