arXiv:2408.01839
Complexity of Minimizing Projected-Gradient-Dominated Functions with Stochastic First-order Oracles

3 August 2024
Saeed Masiha, Saber Salehkaleybar, Niao He, Negar Kiyavash, Patrick Thiran
Abstract

This work investigates the performance limits of projected stochastic first-order methods for minimizing functions under the $(\alpha,\tau,\mathcal{X})$-projected-gradient-dominance property, which asserts that the sub-optimality gap $F(\mathbf{x})-\min_{\mathbf{x}'\in \mathcal{X}}F(\mathbf{x}')$ is upper-bounded by $\tau\cdot\|\mathcal{G}_{\eta,\mathcal{X}}(\mathbf{x})\|^{\alpha}$ for some $\alpha\in[1,2)$ and $\tau>0$, where $\mathcal{G}_{\eta,\mathcal{X}}(\mathbf{x})$ is the projected-gradient mapping with parameter $\eta>0$. For non-convex functions, we show that the complexity lower bound of querying a batch smooth first-order stochastic oracle to obtain an $\epsilon$-global-optimum point is $\Omega(\epsilon^{-2/\alpha})$. Furthermore, we show that a projected variance-reduced first-order algorithm can attain the matching upper complexity bound $\mathcal{O}(\epsilon^{-2/\alpha})$. For convex functions, we establish a complexity lower bound of $\Omega(\log(1/\epsilon)\cdot\epsilon^{-2/\alpha})$ for minimizing functions under a local version of the gradient-dominance property, which also matches the upper complexity bound of accelerated stochastic subgradient methods.
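To make the central object concrete: a minimal sketch of the projected-gradient mapping $\mathcal{G}_{\eta,\mathcal{X}}(\mathbf{x})$, using its standard definition $\mathcal{G}_{\eta,\mathcal{X}}(\mathbf{x}) = \frac{1}{\eta}\big(\mathbf{x} - \Pi_{\mathcal{X}}(\mathbf{x} - \eta \nabla F(\mathbf{x}))\big)$ (an assumption here, since the abstract does not spell it out), specialized to a box constraint where the projection $\Pi_{\mathcal{X}}$ reduces to a coordinate-wise clip.

```python
import numpy as np

def projected_gradient_mapping(x, grad, eta, lo, hi):
    """G_{eta,X}(x) = (x - proj_X(x - eta * grad F(x))) / eta,
    for the box constraint X = [lo, hi]^d (projection = clip).
    Standard definition, assumed; not stated in the abstract."""
    x_plus = np.clip(x - eta * grad(x), lo, hi)  # projected gradient step
    return (x - x_plus) / eta

# Example: F(x) = 0.5 * ||x||^2 on X = [-1, 1]^2, so grad F(x) = x.
grad = lambda x: x
x = np.array([0.5, -0.5])
g = projected_gradient_mapping(x, grad, eta=0.1, lo=-1.0, hi=1.0)
# At an interior point the step is not clipped, so G equals grad F(x).
```

In the unconstrained case ($\mathcal{X}=\mathbb{R}^d$) the mapping reduces to $\nabla F(\mathbf{x})$, so the dominance condition generalizes the usual gradient-dominance (Łojasiewicz-type) inequality to the constrained setting.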
