We study the smooth minimax optimization problem $\min_{x}\max_{y} f(x,y)$, where $f$ is $\ell$-smooth, strongly-concave in $y$ but possibly nonconvex in $x$. Most existing work focuses on finding first-order stationary points of the function $f(x,y)$ or of its primal function $P(x) \triangleq \max_{y} f(x,y)$; few results address second-order stationary points. In this paper, we propose a novel approach for minimax optimization, called Minimax Cubic Newton (MCN), which finds an $\mathcal{O}(\varepsilon, \kappa^{1.5}\sqrt{\rho\varepsilon})$-second-order stationary point of $P(x)$ with $\mathcal{O}(\kappa^{1.5}\sqrt{\rho}\,\varepsilon^{-1.5})$ calls to a second-order oracle and $\tilde{\mathcal{O}}(\kappa^{2}\sqrt{\rho}\,\varepsilon^{-1.5})$ calls to a first-order oracle, where $\kappa$ is the condition number and $\rho$ is the Lipschitz constant of the Hessian of $f(x,y)$. In addition, we propose an inexact variant of MCN for high-dimensional problems that avoids calling the expensive second-order oracle: instead, our method solves the cubic sub-problem inexactly via gradient descent and matrix Chebyshev expansion. This strategy still obtains the desired approximate second-order stationary point with high probability, but requires only $\tilde{\mathcal{O}}(\kappa^{1.5}\ell\,\varepsilon^{-2})$ Hessian-vector oracle calls and $\tilde{\mathcal{O}}(\kappa^{2}\sqrt{\rho}\,\varepsilon^{-1.5})$ first-order oracle calls. To the best of our knowledge, this is the first work to establish non-asymptotic convergence guarantees for finding second-order stationary points of minimax problems without convex-concave assumptions.
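For concreteness, a brief sketch of the two notions the abstract relies on, stated under the standard cubic-regularization conventions (the precise constants used in the paper may differ). A point $x$ is an $(\varepsilon, \delta)$-second-order stationary point of $P$ if
\[
\|\nabla P(x)\| \le \varepsilon
\qquad \text{and} \qquad
\lambda_{\min}\bigl(\nabla^{2} P(x)\bigr) \ge -\delta,
\]
and the cubic sub-problem solved at each iteration $t$ is the minimization of the cubic-regularized second-order model around the current iterate,
\[
s_t \in \arg\min_{s}\;
\langle \nabla P(x_t), s \rangle
+ \tfrac{1}{2}\, s^{\top} \nabla^{2} P(x_t)\, s
+ \tfrac{M}{6}\,\|s\|^{3},
\qquad
x_{t+1} = x_t + s_t,
\]
where $M$ is a regularization parameter tied to the Hessian Lipschitz constant. It is this sub-problem that the inexact variant tackles with gradient descent rather than an exact second-order solve.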