A Bregman Extension of Quasi-Newton Updates II: Convergence and Robustness Properties

We propose an extension of quasi-Newton methods and investigate the convergence and robustness properties of the proposed update formulae for the approximate Hessian matrix. Fletcher studied a variational problem whose solution yields the Hessian update formulae of the quasi-Newton methods. We point out that this variational problem is identical to the minimization of the Kullback-Leibler divergence, a discrepancy measure between two probability distributions. We then introduce the Bregman divergence as an extension of the Kullback-Leibler divergence and derive extended quasi-Newton update formulae from the variational problem with the Bregman divergence. The proposed update formulae belong to the class of self-scaling quasi-Newton methods. We study the convergence properties of the proposed quasi-Newton methods and, moreover, apply tools from robust statistics to analyze how robust the Hessian update formulae are against numerical errors introduced by an inexact line search for the step length. As a result, we find that the influence of the inexact line search is bounded only for the standard BFGS formula for the Hessian approximation. Numerical studies are conducted to verify the usefulness of the tools borrowed from robust statistics.
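For context, the following is a sketch of the variational characterization referred to in the abstract, in our own notation rather than necessarily the paper's; in particular, the ordering of the arguments (which decides whether the BFGS update or its dual DFP update is obtained) is stated here only as an assumption.

```latex
% Sketch of the variational problem behind quasi-Newton updates
% (notation is ours; the argument ordering determines BFGS vs. DFP).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $\psi(A) = \operatorname{tr}(A) - \ln\det(A) - n$ for $A \succ 0$.
Fletcher's variational problem selects the next Hessian approximation as
\begin{equation*}
  B_{k+1} = \operatorname*{arg\,min}_{B \succ 0} \;
  \psi\!\left(B_k^{-1} B\right)
  \quad \text{subject to } B s_k = y_k ,
\end{equation*}
where $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$.
Since, for zero-mean Gaussians,
\begin{equation*}
  \mathrm{KL}\!\left(N(0, \Sigma_0) \,\|\, N(0, \Sigma_1)\right)
  = \tfrac{1}{2}\!\left[\operatorname{tr}(\Sigma_1^{-1}\Sigma_0) - n
    + \ln\tfrac{\det\Sigma_1}{\det\Sigma_0}\right]
  = \tfrac{1}{2}\,\psi\!\left(\Sigma_1^{-1}\Sigma_0\right),
\end{equation*}
the objective is a Kullback-Leibler divergence between Gaussians.
The extension replaces this objective by a Bregman divergence
\begin{equation*}
  D_\phi(X, Y) = \phi(X) - \phi(Y)
    - \operatorname{tr}\!\left(\nabla\phi(Y)^{\top}(X - Y)\right),
\end{equation*}
which recovers the KL case for $\phi(X) = -\ln\det X$, since then
$D_\phi(X, Y) = \psi\!\left(Y^{-1} X\right)$.
\end{document}
```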
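And a minimal runnable sketch, ours rather than the paper's code, of the standard BFGS Hessian update (the formula the abstract singles out as the one with bounded sensitivity to an inexact line search), together with the log-det Bregman divergence discussed above; the extended self-scaling updates themselves are not reproduced here.

```python
# Minimal sketch (not the paper's code) of the standard BFGS update and
# the log-det Bregman divergence that generates it variationally.
import numpy as np

def bfgs_update(B, s, y):
    """One BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k,  y = grad f(x_{k+1}) - grad f(x_k).
    Requires the curvature condition s^T y > 0 so that the updated
    matrix stays positive definite.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def logdet_divergence(X, Y):
    """Bregman divergence generated by phi(X) = -log det(X) on SPD
    matrices; equals twice the KL divergence between zero-mean
    Gaussians with covariances X and Y."""
    A = np.linalg.solve(Y, X)  # Y^{-1} X
    n = X.shape[0]
    return np.trace(A) - np.linalg.slogdet(A)[1] - n

# Tiny check on a random SPD example: the updated matrix satisfies the
# secant condition B_{k+1} s = y exactly.
rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
B = M @ M.T + n * np.eye(n)        # SPD starting approximation
s = rng.standard_normal(n)
y = B @ s + 0.1 * rng.standard_normal(n)
assert s @ y > 0                   # curvature condition
B_new = bfgs_update(B, s, y)
print(np.allclose(B_new @ s, y))   # True: secant condition holds
print(logdet_divergence(B_new, B)) # divergence between new and old B
```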