ResearchTrend.AI

Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions (arXiv:2006.08167)

15 June 2020
Tesi Xiao, Krishnakumar Balasubramanian, Saeed Ghadimi

Papers citing "Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions"

3 papers shown:
Faster Convergence of Stochastic Accelerated Gradient Descent under Interpolation
  Aaron Mishkin, Mert Pilanci, Mark Schmidt · 03 Apr 2024
Stochastic Mirror Descent: Convergence Analysis and Adaptive Variants via the Mirror Stochastic Polyak Stepsize
  Ryan D'Orazio, Nicolas Loizou, I. Laradji, Ioannis Mitliagkas · 28 Oct 2021
A Linearly Convergent Conditional Gradient Algorithm with Applications to Online and Stochastic Optimization
  Dan Garber, Elad Hazan · 20 Jan 2013