Stochastic Variance-Reduced Cubic Regularization for Nonconvex Optimization

20 February 2018
Zhe Wang
Yi Zhou
Yingbin Liang
Guanghui Lan
Abstract

Cubic regularization (CR) is an optimization method with emerging popularity due to its capability to escape saddle points and converge to second-order stationary solutions for nonconvex optimization. However, CR encounters a high sample complexity issue for finite-sum problems with a large data size, and various inexact variants of CR have been proposed to improve the sample complexity. In this paper, we propose a stochastic variance-reduced cubic-regularization (SVRC) method under random sampling, and study its convergence guarantee as well as its sample complexity. We show that the iteration complexity of SVRC for achieving a second-order stationary solution within ϵ accuracy is O(ϵ^{-3/2}), which matches the state-of-the-art result for CR-type methods. Moreover, our proposed variance reduction scheme significantly reduces the per-iteration sample complexity. The resulting total Hessian sample complexity of our SVRC is O(N^{2/3} ϵ^{-3/2}), which outperforms the state-of-the-art result by a factor of O(N^{2/15}). We also study our SVRC under a random sampling without replacement scheme, which yields a lower per-iteration sample complexity and hence justifies its practical applicability.
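To make the variance-reduction idea concrete, the sketch below implements a generic SVRG-style cubic-regularization loop on a toy least-squares problem: each epoch takes a full gradient/Hessian snapshot at a reference point, each inner step forms variance-reduced gradient and Hessian estimates from a small sample, and the update solves a cubic-regularized subproblem. The toy objective, the inner subproblem solver, and all function names here are illustrative assumptions, not the authors' estimators or implementation; the paper's exact correction terms and analysis differ.

```python
# Minimal NumPy sketch of a stochastic variance-reduced cubic-regularization
# (SVRC) loop. Illustrative only: generic SVRG-style estimators on a toy
# least-squares problem, not the authors' exact method.
import numpy as np

rng = np.random.default_rng(0)
N, d = 200, 5                          # N component functions, dimension d
A = rng.normal(size=(N, d))
b = rng.normal(size=N)

def grad_i(x, i):                      # gradient of the i-th component f_i
    return A[i] * (A[i] @ x - b[i])

def hess_i(x, i):                      # Hessian of the i-th component f_i
    return np.outer(A[i], A[i])

def full_grad(x):
    return A.T @ (A @ x - b) / N

def full_hess(x):
    return A.T @ A / N

def solve_cubic_model(g, H, M, steps=200, lr=0.1):
    """Approximately minimize g^T h + 0.5 h^T H h + (M/6)||h||^3 by gradient descent."""
    h = np.zeros_like(g)
    for _ in range(steps):
        grad_m = g + H @ h + 0.5 * M * np.linalg.norm(h) * h
        h -= lr * grad_m
    return h

def svrc(x0, epochs=5, inner=10, batch=20, M=10.0):
    x = x0.copy()
    for _ in range(epochs):
        x_ref = x.copy()               # snapshot (reference) point
        g_ref, H_ref = full_grad(x_ref), full_hess(x_ref)
        for _ in range(inner):
            idx = rng.choice(N, size=batch, replace=False)   # sampling without replacement
            # variance-reduced gradient and Hessian estimates
            g = g_ref + np.mean([grad_i(x, i) - grad_i(x_ref, i) for i in idx], axis=0)
            H = H_ref + np.mean([hess_i(x, i) - hess_i(x_ref, i) for i in idx], axis=0)
            x = x + solve_cubic_model(g, H, M)
    return x

x_out = svrc(np.zeros(d))
print("final gradient norm:", np.linalg.norm(full_grad(x_out)))
```

The snapshot correction is what drives the per-iteration sample savings: only a small batch is touched per inner step, while the reference gradient and Hessian keep the estimator bias-free, which is the mechanism behind the O(N^{2/3} ϵ^{-3/2}) total Hessian sample complexity claimed in the abstract.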
