
Optimal rates for first-order stochastic convex optimization under Tsybakov noise condition

Aaditya Ramdas
Aarti Singh
Abstract

We focus on the problem of minimizing a convex function $f$ over a convex set $S$ given $T$ queries to a stochastic first-order oracle. We argue that the complexity of convex minimization is determined only by the rate of growth of the function around its minimizer $x^*_{f,S}$, as quantified by a Tsybakov-like noise condition. Specifically, we prove that if $f$ grows at least as fast as $\|x - x^*_{f,S}\|^\kappa$ around its minimum, for some $\kappa > 1$, then the optimal rate of learning $f(x^*_{f,S})$ is $\Theta(T^{-\frac{\kappa}{2\kappa-2}})$. The classic rates $\Theta(1/\sqrt{T})$ for convex functions and $\Theta(1/T)$ for strongly convex functions are special cases of our result for $\kappa \rightarrow \infty$ and $\kappa = 2$, and even faster rates are attained for $\kappa < 2$. We also derive tight bounds for the complexity of learning $x^*_{f,S}$, where the optimal rate is $\Theta(T^{-\frac{1}{2\kappa-2}})$. Interestingly, these precise rates for convex optimization also characterize the complexity of active learning, and our results further strengthen the connections between the two fields, both of which rely on feedback-driven queries.
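
For concreteness, the special cases quoted above follow from evaluating the exponent $\frac{\kappa}{2\kappa-2}$ in the function-value rate directly; an intermediate value such as $\kappa = 3/2$ illustrates the faster regime:
\[
\frac{\kappa}{2\kappa-2}\bigg|_{\kappa=2} = 1 \;\Rightarrow\; \Theta(1/T),
\qquad
\lim_{\kappa\to\infty} \frac{\kappa}{2\kappa-2} = \frac{1}{2} \;\Rightarrow\; \Theta(1/\sqrt{T}),
\qquad
\frac{\kappa}{2\kappa-2}\bigg|_{\kappa=3/2} = \frac{3}{2} \;\Rightarrow\; \Theta(T^{-3/2}).
\]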
