Stochastic Variance-Reduced Cubic Regularized Newton Method

13 February 2018
Dongruo Zhou
Pan Xu
Quanquan Gu
Abstract

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient, along with a semi-stochastic Hessian, specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an $(\epsilon, \sqrt{\epsilon})$-approximate local minimum within $\tilde{O}(n^{4/5}/\epsilon^{3/2})$ second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms, including subsampled cubic regularization. Our work also sheds light on the application of variance reduction techniques to high-order non-convex optimization methods. Thorough experiments on various non-convex optimization problems support our theory.
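For intuition, here is a minimal Python sketch of one epoch of a variance-reduced cubic-regularized Newton method of the kind the abstract describes: a snapshot full gradient and Hessian are combined with minibatch corrections to form semi-stochastic estimates, which are then fed into a cubic-regularized subproblem. The exact estimator forms, batch sizes, and the crude gradient-descent subproblem solver are illustrative assumptions, not the paper's precise algorithm.

```python
# Hedged sketch of a variance-reduced cubic-regularized Newton epoch.
# The estimator construction and subproblem solver are assumptions made
# for illustration, not the paper's exact specification.
import numpy as np

def solve_cubic_subproblem(v, U, M, steps=200, lr=0.01):
    """Approximately minimize m(h) = v.h + 0.5*h'Uh + (M/6)*||h||^3
    by gradient descent (a simple stand-in for an exact cubic solver)."""
    h = np.zeros_like(v)
    for _ in range(steps):
        grad_m = v + U @ h + 0.5 * M * np.linalg.norm(h) * h
        h -= lr * grad_m
    return h

def svrc_epoch(x_tilde, grads, hessians, M, inner_steps, batch, rng):
    """One epoch: compute the full gradient/Hessian at the snapshot x_tilde,
    then take inner cubic-Newton steps using semi-stochastic estimates.
    grads[i](x) and hessians[i](x) are callables for the i-th component."""
    n = len(grads)
    g_full = np.mean([g(x_tilde) for g in grads], axis=0)     # full gradient at snapshot
    H_full = np.mean([H(x_tilde) for H in hessians], axis=0)  # full Hessian at snapshot
    x = x_tilde.copy()
    for _ in range(inner_steps):
        idx = rng.choice(n, size=batch, replace=False)
        d = x - x_tilde
        # Semi-stochastic gradient: minibatch difference corrected by the
        # snapshot's full gradient plus a Hessian term (one common construction).
        v = np.mean([grads[i](x) - grads[i](x_tilde) - hessians[i](x_tilde) @ d
                     for i in idx], axis=0) + g_full + H_full @ d
        # Semi-stochastic Hessian: minibatch difference plus the full snapshot Hessian.
        U = np.mean([hessians[i](x) - hessians[i](x_tilde) for i in idx], axis=0) + H_full
        x = x + solve_cubic_subproblem(v, U, M)
    return x
```

In practice the inner gradient-descent loop would be replaced by a dedicated cubic-subproblem solver (e.g. a Lanczos- or eigenvalue-based routine); the point of the sketch is only how variance-reduced gradient and Hessian estimates plug into the cubic regularization update.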
