arXiv:2309.02412
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians

5 September 2023
N. Doikov
G. N. Grapiglia
Abstract

In this work, we develop first-order (Hessian-free) and zeroth-order (derivative-free) implementations of the cubically regularized Newton method for solving general non-convex optimization problems. To this end, we employ finite-difference approximations of the derivatives. Our algorithms use a special adaptive search procedure that simultaneously fits both the regularization constant and the parameters of the finite-difference approximations, which frees our schemes from needing to know the actual Lipschitz constants. Additionally, we equip our algorithms with a lazy Hessian update that reuses a previously computed Hessian approximation matrix for several iterations. Specifically, we prove a global complexity bound of $\mathcal{O}(n^{1/2} \epsilon^{-3/2})$ function and gradient evaluations for our new Hessian-free method, and a bound of $\mathcal{O}(n^{3/2} \epsilon^{-3/2})$ function evaluations for the derivative-free method, where $n$ is the dimension of the problem and $\epsilon$ is the desired accuracy for the gradient norm. These complexity bounds significantly improve the previously known ones in terms of the joint dependence on $n$ and $\epsilon$, for first-order and zeroth-order non-convex optimization.
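The finite-difference approximations underlying both methods replace exact derivatives with function-value queries. As a minimal illustrative sketch (not the authors' adaptive scheme, which also tunes the step size $h$ and the regularization constant on the fly), a forward-difference gradient estimate costs $n$ extra function evaluations per call:

```python
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Forward-difference approximation of the gradient of f at x.

    Uses n + 1 function evaluations for an n-dimensional input.
    The step h is fixed here; the paper's methods choose such
    parameters adaptively instead.
    """
    n = x.size
    fx = f(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h  # perturb only coordinate i
        g[i] = (f(x + e) - fx) / h
    return g

# Sanity check on f(x) = ||x||^2, whose exact gradient is 2x.
f = lambda x: float(x @ x)
x0 = np.array([1.0, -2.0, 0.5])
approx = fd_gradient(f, x0)
```

This count of roughly $n$ evaluations per gradient estimate is what introduces the dimension factors $n^{1/2}$ and $n^{3/2}$ in the complexity bounds above, once combined with the lazy Hessian reuse.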
