An adaptive low dimensional quasi-Newton sum of functions optimizer

International Conference on Machine Learning (ICML), 2013
Abstract

We present an algorithm for minimizing a sum of functions that combines the computational efficiency of stochastic gradient descent (SGD) with the second-order curvature information accessible to quasi-Newton methods. We unify these disparate approaches by maintaining an independent Hessian approximation for each contributing function in the sum. We maintain computational tractability and limit memory requirements even for high-dimensional optimization problems by storing and manipulating these quadratic approximations in a shared, time-evolving, low-dimensional subspace. This algorithm contrasts with earlier stochastic second-order techniques, which treat the Hessian of each contributing function as a noisy approximation to the full Hessian rather than as a target for direct estimation. Our approach reaps the benefits of both SGD and quasi-Newton methods: each update step requires only a single subfunction evaluation (as in SGD), while each step is scaled using an approximate inverse Hessian, and little to no adjustment of hyperparameters is required (as is typical for quasi-Newton methods). We experimentally demonstrate improved convergence on six diverse optimization problems.
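The following is a heavily simplified sketch of the core idea described in the abstract: keep an independent quadratic (Hessian) model for each subfunction, evaluate only one subfunction per step, and move using the inverse of the summed Hessian models. It is not the authors' implementation; the shared low-dimensional subspace machinery is omitted and full-dimensional Hessian matrices are stored instead, and all names (e.g. ToySumOfFunctionsOptimizer) are illustrative assumptions.

```python
import numpy as np

class ToySumOfFunctionsOptimizer:
    """Illustrative sketch only: one Hessian approximation per subfunction,
    updated from a single subfunction evaluation per step, with the update
    direction taken from the sum of the per-subfunction quadratic models.
    The paper's shared, time-evolving low-dimensional subspace is omitted."""

    def __init__(self, f_df_list, x0, init_scale=1.0):
        self.f_df = f_df_list                      # each returns (value, gradient)
        self.x = np.asarray(x0, dtype=float)
        n, m = self.x.size, len(f_df_list)
        self.H = [init_scale * np.eye(n) for _ in range(m)]   # per-subfunction Hessian models
        self.last_x = [self.x.copy() for _ in range(m)]       # last evaluation point per subfunction
        self.last_g = [None] * m                               # last gradient per subfunction

    def step(self, i):
        """Evaluate subfunction i only (as in SGD), refresh its quadratic model,
        then take a step scaled by the summed approximate Hessian."""
        f, g = self.f_df[i](self.x)
        if self.last_g[i] is not None:
            # BFGS-style secant update of subfunction i's Hessian model,
            # skipped when the curvature condition fails.
            s = self.x - self.last_x[i]
            y = g - self.last_g[i]
            sy = s @ y
            if sy > 1e-10:
                Hs = self.H[i] @ s
                self.H[i] += np.outer(y, y) / sy - np.outer(Hs, Hs) / (s @ Hs)
        self.last_x[i], self.last_g[i] = self.x.copy(), g
        # Gradient of the summed quadratic models at the current point.
        g_total = sum(gi + Hi @ (self.x - xi)
                      for gi, Hi, xi in zip(self.last_g, self.H, self.last_x)
                      if gi is not None)
        H_total = sum(self.H)
        # Newton-like step on the summed model (small regularization for safety).
        self.x -= np.linalg.solve(H_total + 1e-6 * np.eye(self.x.size), g_total)
        return f

# Usage sketch: a least-squares objective split into minibatch subfunctions.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
def make_sub(rows):
    Ai, bi = A[rows], b[rows]
    return lambda x: (0.5 * np.sum((Ai @ x - bi) ** 2), Ai.T @ (Ai @ x - bi))
subs = [make_sub(slice(k, k + 20)) for k in range(0, 100, 20)]
opt = ToySumOfFunctionsOptimizer(subs, np.zeros(5))
for t in range(50):
    opt.step(t % len(subs))   # one subfunction evaluation per update step
```

In the paper, the per-subfunction models are instead stored and updated within a shared low-dimensional subspace spanned by recent gradients and iterates, which is what keeps memory and computation tractable in high dimensions; the dense matrices above are purely for exposition.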
