Optimal rates for first-order stochastic convex optimization under Tsybakov noise condition

We focus on the problem of minimizing a convex function $f$ over a convex set $S$ given $T$ queries to a stochastic first-order oracle. We argue that the complexity of convex minimization is determined only by the rate of growth of the function around its minimizer $x^*_{f,S}$, as quantified by a Tsybakov-like noise condition. Specifically, we prove that if $f$ grows at least as fast as $\lambda \|x - x^*_{f,S}\|^{\kappa}$ around its minimum, for some $\lambda > 0$ and $\kappa > 1$, then the optimal rate of learning $f(x^*_{f,S})$ is $\Theta(T^{-\frac{\kappa}{2\kappa-2}})$. The classic rates of $\Theta(1/\sqrt{T})$ for convex functions and $\Theta(1/T)$ for strongly convex functions are special cases of our result for $\kappa \to \infty$ and $\kappa = 2$, and even faster rates are attained for $\kappa < 2$. We also derive tight bounds for the complexity of learning $x^*_{f,S}$, where the optimal rate is $\Theta(T^{-\frac{1}{2\kappa-2}})$. Interestingly, these precise rates for convex optimization also characterize the complexity of active learning, and our results further strengthen the connections between the two fields, both of which rely on feedback-driven queries.
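As a quick sanity check on the exponent (a worked substitution added for clarity, not part of the abstract's argument), plugging in the two landmark values of $\kappa$ recovers the classic rates:
$$
\kappa = 2:\ \Theta\!\left(T^{-\frac{\kappa}{2\kappa-2}}\right) = \Theta\!\left(T^{-\frac{2}{2}}\right) = \Theta(1/T),
\qquad
\kappa \to \infty:\ \frac{\kappa}{2\kappa-2} \to \frac{1}{2},\ \text{so the rate tends to } \Theta(1/\sqrt{T}),
$$
while for $1 < \kappa < 2$ the exponent $\frac{\kappa}{2\kappa-2}$ exceeds $1$, giving rates faster than $\Theta(1/T)$.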