
Stochastic Cubic Regularization for Fast Nonconvex Optimization

Abstract

This paper proposes a stochastic variant of a classic algorithm---the cubic-regularized Newton method [Nesterov and Polyak 2006]. The proposed algorithm efficiently escapes saddle points and finds approximate local minima for general smooth, nonconvex functions in only $\tilde{\mathcal{O}}(\epsilon^{-3.5})$ stochastic gradient and stochastic Hessian-vector product evaluations. The latter can be computed as efficiently as stochastic gradients. This improves upon the $\tilde{\mathcal{O}}(\epsilon^{-4})$ rate of stochastic gradient descent. Our rate matches the best-known result for finding local minima without requiring any delicate acceleration or variance-reduction techniques.
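As a rough illustration of the idea described in the abstract---a cubic-regularized Newton step built from stochastic gradients and stochastic Hessian-vector products---here is a minimal Python sketch. The oracle names `stoch_grad` and `stoch_hvp`, the subsolver step count, and the step sizes are hypothetical placeholders, not the paper's exact procedure or parameter choices.

```python
import numpy as np

def cubic_subsolver(g, hvp, rho, steps=20, eta=0.01):
    """Approximately minimize the cubic submodel
        m(d) = g^T d + 0.5 d^T H d + (rho/6) ||d||^3
    by gradient descent, using only Hessian-vector products (hvp),
    so the full Hessian is never formed.
    """
    d = np.zeros_like(g)
    for _ in range(steps):
        # gradient of the submodel at d: g + H d + (rho/2)||d|| d
        grad_m = g + hvp(d) + 0.5 * rho * np.linalg.norm(d) * d
        d -= eta * grad_m
    return d

def stochastic_cubic_newton(x0, stoch_grad, stoch_hvp, rho, n_iters=100):
    """Sketch of a stochastic cubic-regularized Newton loop.

    stoch_grad(x)   -> stochastic (minibatch) gradient estimate at x
    stoch_hvp(x, v) -> stochastic Hessian-vector product at x applied to v
    rho             -> cubic-regularization parameter (Hessian Lipschitz bound)
    """
    x = x0
    for _ in range(n_iters):
        g = stoch_grad(x)                     # minibatch gradient
        hvp = lambda v, x=x: stoch_hvp(x, v)  # minibatch Hessian-vector oracle
        d = cubic_subsolver(g, hvp, rho)      # approximate cubic step
        x = x + d
    return x
```

The key point this sketch captures is that each iteration needs only gradient and Hessian-vector product evaluations, which cost roughly the same as a stochastic gradient when computed via automatic differentiation.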
