ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

arXiv:1605.03529

On the Iteration Complexity of Oblivious First-Order Optimization Algorithms

11 May 2016
Yossi Arjevani
Ohad Shamir
Abstract

We consider a broad class of first-order optimization algorithms which are \emph{oblivious}, in the sense that their step sizes are scheduled regardless of the function under consideration, except for limited side-information such as smoothness or strong convexity parameters. With the knowledge of these two parameters, we show that any such algorithm attains an iteration complexity lower bound of $\Omega(\sqrt{L/\epsilon})$ for $L$-smooth convex functions, and $\tilde{\Omega}(\sqrt{L/\mu}\ln(1/\epsilon))$ for $L$-smooth $\mu$-strongly convex functions. These lower bounds are stronger than those in the traditional oracle model, as they hold independently of the dimension. To attain these, we abandon the oracle model in favor of a structure-based approach which builds upon a framework recently proposed in (Arjevani et al., 2015). We further show that without knowing the strong convexity parameter, it is impossible to attain an iteration complexity better than $\tilde{\Omega}\left((L/\mu)\ln(1/\epsilon)\right)$. This result is then used to formalize an observation regarding $L$-smooth convex functions, namely, that the iteration complexity of algorithms employing time-invariant step sizes must be at least $\Omega(L/\epsilon)$.
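To make the notion of an oblivious algorithm concrete, here is a minimal sketch (not taken from the paper): plain gradient descent with the time-invariant step size $1/L$ is the canonical oblivious method, since its schedule depends only on the smoothness parameter $L$ and never on the iterates or the function itself. On $L$-smooth convex functions such a time-invariant schedule needs on the order of $L/\epsilon$ iterations, matching the $\Omega(L/\epsilon)$ lower bound stated in the abstract. The quadratic test function and the iteration count below are illustrative choices, not values from the paper.

```python
import numpy as np

def oblivious_gd(grad, x0, L, num_iters):
    """Gradient descent with a fixed, function-independent step size 1/L.

    The schedule uses only the smoothness parameter L (side-information),
    so the method is "oblivious" in the sense described above.
    """
    x = x0
    for _ in range(num_iters):
        x = x - (1.0 / L) * grad(x)
    return x

# Illustrative example: f(x) = 0.5 * x^T A x with A = diag(1, L) is convex
# and L-smooth, with minimum value 0 at the origin.
L = 100.0
A = np.diag([1.0, L])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x0 = np.ones(2)
x_final = oblivious_gd(grad, x0, L, num_iters=2000)
print(f(x_final))  # close to the optimal value 0
```

The poorly conditioned coordinate (curvature 1 versus step size $1/L = 0.01$) is what forces the slow $O(L/\epsilon)$ rate for this fixed schedule; accelerated oblivious schemes with time-varying step sizes improve this to the $O(\sqrt{L/\epsilon})$ rate that the paper's first lower bound shows is optimal.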
