Stochastic Recursive Variance-Reduced Cubic Regularization Methods

Dongruo Zhou
Quanquan Gu
Abstract

Stochastic Variance-Reduced Cubic regularization (SVRC) algorithms have received increasing attention due to their improved gradient/Hessian complexities (i.e., number of queries to stochastic gradient/Hessian oracles) for finding local minima in nonconvex finite-sum optimization. However, it is unclear whether existing SVRC algorithms can be further improved. Moreover, the semi-stochastic Hessian estimator adopted in existing SVRC algorithms prevents the use of Hessian-vector product-based fast cubic subproblem solvers, which makes SVRC algorithms computationally intractable for high-dimensional problems. In this paper, we first present a Stochastic Recursive Variance-Reduced Cubic regularization method (SRVRC), which uses recursively updated semi-stochastic gradient and Hessian estimators. It enjoys improved gradient and Hessian complexities for finding an $(\epsilon, \sqrt{\epsilon})$-approximate local minimum, and outperforms the state-of-the-art SVRC algorithms. Built upon SRVRC, we further propose a Hessian-free SRVRC algorithm, namely SRVRC$_{\text{free}}$, which only requires stochastic gradient and Hessian-vector product computations, and achieves $\tilde O(dn\epsilon^{-2} \land d\epsilon^{-3})$ runtime complexity, where $n$ is the number of component functions in the finite-sum structure, $d$ is the problem dimension, and $\epsilon$ is the optimization precision. This outperforms the best-known runtime complexity $\tilde O(d\epsilon^{-3.5})$ achieved by the stochastic cubic regularization algorithm proposed in Tripuraneni et al. (2018).
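To illustrate the kind of recursively updated semi-stochastic gradient estimator the abstract refers to, here is a minimal sketch of a SARAH-style recursive estimator on a toy least-squares finite sum. The specific update rule, step size, and batch size of one are illustrative assumptions for exposition, not the paper's actual SRVRC algorithm (which applies analogous recursive estimators to both gradients and Hessians inside a cubic-regularized subproblem).

```python
import numpy as np

# Illustrative assumption: a SARAH-style recursive semi-stochastic
# gradient estimator on f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 100, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component f_i(x) = 0.5*(a_i^T x - b_i)^2.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full finite-sum gradient, used only at the anchor point.
    return A.T @ (A @ x - b) / n

x_prev = rng.standard_normal(d)
v = full_grad(x_prev)      # anchor the estimator with one full gradient
x = x_prev - 0.1 * v

for t in range(20):
    i = rng.integers(n)
    # Recursive update: correct the previous estimate v with the
    # stochastic gradient difference at the new and old iterates.
    v = grad_i(x, i) - grad_i(x_prev, i) + v
    x_prev, x = x, x - 0.1 * v
```

The key property is that the estimator `v` is updated from its own previous value rather than rebuilt from a fixed snapshot, which is what lets recursive variance-reduction schemes reduce the number of stochastic oracle queries.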
