In this paper, we provide near-optimal accelerated first-order methods for minimizing a broad class of smooth nonconvex functions that are strictly unimodal on all lines through a minimizer. This function class, which we call the class of smooth quasar-convex functions, is parameterized by a constant $\gamma \in (0,1]$, where $\gamma = 1$ encompasses the classes of smooth convex and star-convex functions, and smaller values of $\gamma$ indicate that the function can be "more nonconvex." We develop a variant of accelerated gradient descent that computes an $\epsilon$-approximate minimizer of a smooth $\gamma$-quasar-convex function with at most $O\!\left(\gamma^{-1}\epsilon^{-1/2}\log(\gamma^{-1}\epsilon^{-1})\right)$ total function and gradient evaluations. We also derive a lower bound of $\Omega\!\left(\gamma^{-1}\epsilon^{-1/2}\right)$ on the worst-case number of gradient evaluations required by any deterministic first-order method, showing that, up to a logarithmic factor, no deterministic first-order method can improve upon ours.
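For reference, the inequality below is a sketch of the standard definition of $\gamma$-quasar-convexity as it is usually stated in the literature; the abstract itself does not spell it out, and the paper may impose additional smoothness or domain conditions, so treat this as an assumed formulation rather than the paper's exact statement.

```latex
% Assumed standard definition of gamma-quasar-convexity (not quoted from the paper):
% a differentiable function f is gamma-quasar-convex with respect to a minimizer x*
% if there exists gamma in (0,1] such that, for all x,
\[
  f(x^\star) \;\ge\; f(x) + \frac{1}{\gamma}\,\nabla f(x)^\top (x^\star - x).
\]
% Taking gamma = 1 recovers star-convexity (and is satisfied by smooth convex
% functions), consistent with the abstract's statement that gamma = 1
% encompasses the smooth convex and star-convex classes; smaller gamma
% weakens the inequality, allowing "more nonconvex" behavior.
```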