Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions

15 June 2020
Tesi Xiao
Krishnakumar Balasubramanian
Saeed Ghadimi
Abstract

We analyze stochastic conditional gradient methods for constrained optimization problems arising in over-parametrized machine learning. We show that one could leverage the interpolation-like conditions satisfied by such models to obtain improved oracle complexities. Specifically, when the objective function is convex, we show that the conditional gradient method requires $\mathcal{O}(\epsilon^{-2})$ calls to the stochastic gradient oracle to find an $\epsilon$-optimal solution. Furthermore, by including a gradient sliding step, we show that the number of calls reduces to $\mathcal{O}(\epsilon^{-1.5})$.
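
The conditional gradient (Frank-Wolfe) method discussed in the abstract avoids projections by calling a linear minimization oracle over the constraint set and taking a convex-combination step. The sketch below illustrates a basic stochastic Frank-Wolfe loop over the $\ell_1$-ball; the helper names (`linear_minimization_oracle`, `minibatch_gradient`), the toy least-squares problem, and the step-size schedule are illustrative assumptions, not the paper's exact algorithm or its gradient-sliding variant.

```python
import numpy as np

def linear_minimization_oracle(grad, radius=1.0):
    """Solve min_{||s||_1 <= radius} <grad, s>: a signed, scaled basis vector."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def stochastic_frank_wolfe(x0, minibatch_gradient, num_steps=100, radius=1.0):
    """Basic stochastic Frank-Wolfe loop with step size gamma_t = 2 / (t + 2)."""
    x = x0.copy()
    for t in range(num_steps):
        g = minibatch_gradient(x)            # stochastic gradient estimate
        s = linear_minimization_oracle(g, radius)
        gamma = 2.0 / (t + 2.0)              # standard diminishing step size
        x = (1.0 - gamma) * x + gamma * s    # convex combination stays feasible
    return x

# Toy usage: minibatch least squares (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 10)), rng.standard_normal(200)

def minibatch_gradient(x, batch=32):
    idx = rng.choice(len(b), size=batch, replace=False)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch

x_hat = stochastic_frank_wolfe(np.zeros(10), minibatch_gradient, num_steps=500)
```

Under the interpolation-like conditions the paper studies (roughly, the stochastic gradient noise vanishing at the optimum, as in models that fit all training data), minibatch sizes and step sizes can be tuned to reach the improved oracle complexities stated above; the loop here only shows the mechanics of the method.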
