We propose in this paper New Q-Newton's method. The update rule is conceptually very simple, for example $x_{n+1} = x_n - w_n$, where $w_n = pr_{A_n,+}(v_n) - pr_{A_n,-}(v_n)$, with $A_n = \nabla^2 f(x_n) + \delta_n \|\nabla f(x_n)\|^{1+\alpha} Id$ and $v_n = A_n^{-1} \nabla f(x_n)$. Here $\delta_n$ is an appropriate real number so that $A_n$ is invertible, and $pr_{A_n,\pm}$ are the projections to the vector subspaces generated by eigenvectors of positive (correspondingly negative) eigenvalues of $A_n$. The main result of this paper roughly says that if $f$ is $C^3$ (can be unbounded from below) and a sequence $\{x_n\}$, constructed by New Q-Newton's method from a random initial point $x_0$, {\bf converges}, then the limit point is a critical point and is not a saddle point, and the convergence rate is the same as that of Newton's method. The first author has recently succeeded in incorporating Backtracking line search into New Q-Newton's method, thus resolving the convergence guarantee issue observed for some (non-smooth) cost functions. An application to quickly finding zeros of a univariate meromorphic function will be discussed. Various experiments against well-known algorithms, such as BFGS and Adaptive Cubic Regularization, are presented.
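The update rule above can be sketched numerically as follows. This is a minimal illustration, not the authors' reference implementation: the candidate values for $\delta_n$, the choice $\alpha = 1$, the determinant threshold, and the test function are all illustrative assumptions.

```python
import numpy as np

def new_q_newton_step(x, grad, hess, deltas=(0.0, 1.0, 2.0), alpha=1.0):
    """One step of New Q-Newton's method (illustrative sketch).

    x_{n+1} = x_n - w_n, where w_n = pr_{A_n,+}(v_n) - pr_{A_n,-}(v_n),
    A_n = Hess f(x_n) + delta_n ||grad f(x_n)||^{1+alpha} Id,
    v_n = A_n^{-1} grad f(x_n).
    """
    g = grad(x)
    H = hess(x)
    n = len(x)
    # Pick delta_n from a fixed list so that A_n is (numerically) invertible.
    for delta in deltas:
        A = H + delta * np.linalg.norm(g) ** (1 + alpha) * np.eye(n)
        if abs(np.linalg.det(A)) > 1e-12:
            break
    v = np.linalg.solve(A, g)  # v_n = A_n^{-1} grad f(x_n)
    # Decompose v in an eigenbasis of the symmetric matrix A_n, then flip the
    # sign of components lying in the negative eigenspaces: pr_+(v) - pr_-(v).
    eigvals, eigvecs = np.linalg.eigh(A)
    coeffs = eigvecs.T @ v
    w = eigvecs @ (np.sign(eigvals) * coeffs)
    return x - w

# Illustrative run: f(x, y) = (x^2 - 1)^2 + y^2 has a saddle point at the
# origin and minima at (+-1, 0); starting near the saddle, iterates escape it.
f_grad = lambda z: np.array([4 * z[0] * (z[0] ** 2 - 1), 2 * z[1]])
f_hess = lambda z: np.array([[12 * z[0] ** 2 - 4, 0.0], [0.0, 2.0]])
x = np.array([0.1, 0.5])
for _ in range(50):
    x = new_q_newton_step(x, f_grad, f_hess)
```

Starting from $(0.1, 0.5)$, where the Hessian is indefinite, the sign flip on the negative eigenspace pushes the iterate away from the saddle at the origin, and the sequence converges to the minimum $(1, 0)$ with Newton-like speed.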