A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization

10 January 2023
Chuan He
Heng Huang
Zhaosong Lu
Abstract

In this paper, we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization, which minimizes a twice differentiable function subject to nonlinear equality constraints and a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. Under some mild assumptions, we show that our method enjoys a total inner iteration complexity of $\widetilde{\mathcal{O}}(\epsilon^{-11/2})$ and an operation complexity of $\widetilde{\mathcal{O}}(\epsilon^{-11/2}\min\{n,\epsilon^{-5/4}\})$ for finding an $(\epsilon,\sqrt{\epsilon})$-SOSP of general nonconvex conic optimization with high probability. Moreover, under a constraint qualification, these complexity bounds are improved to $\widetilde{\mathcal{O}}(\epsilon^{-7/2})$ and $\widetilde{\mathcal{O}}(\epsilon^{-7/2}\min\{n,\epsilon^{-3/4}\})$, respectively. To the best of our knowledge, this is the first study on the complexity of finding an approximate SOSP of general nonconvex conic optimization. Preliminary numerical results are presented to demonstrate the superiority of the proposed method over first-order methods in terms of solution quality.
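
The abstract does not restate the optimization model, but the problem class it describes can be written generically as below; the notation ($f$, $c$, the cone $\mathcal{K}$) and the barrier-augmented Lagrangian shown are an illustrative sketch consistent with the abstract's description, not necessarily the paper's exact formulation.

% Generic nonconvex conic program of the kind described in the abstract
% (illustrative notation, not necessarily the paper's):
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{s.t.} \quad c(x) = 0, \quad x \in \mathcal{K},
% where f is twice differentiable, c : \mathbb{R}^n \to \mathbb{R}^m collects the
% nonlinear equality constraints, and \mathcal{K} is a closed convex cone.

% A barrier-augmented Lagrangian of the kind the method's name suggests,
% combining a barrier B for \mathcal{K} with an augmented Lagrangian term for
% c(x) = 0 (an assumed standard form, not the paper's exact construction):
\mathcal{L}_{\mu,\rho}(x;\lambda)
  = f(x) + \mu\, B(x) + \lambda^{\top} c(x) + \tfrac{\rho}{2}\,\|c(x)\|^2 .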
