
The Lingering of Gradients: Theory and Applications

Abstract

Classically, the time complexity of a first-order method is estimated by its number of gradient computations. In this paper, we study a more refined complexity by taking into account the 'lingering' of gradients: once a gradient is computed at $x_k$, the additional time to compute gradients at $x_{k+1}, x_{k+2}, \dots$ may be reduced. We show how this improves the running time of several first-order methods. For instance, if the 'additional time' scales linearly with the traveled distance, then the 'convergence rate' of gradient descent can be improved from $1/T$ to $\exp(-T^{1/3})$. On the application side, we solve a hypothetical revenue management problem on the Yahoo! Front Page Today Module with 4.6 million users to $10^{-6}$ error using only six passes over the dataset, and solve a real-life support vector machine problem to an accuracy two orders of magnitude better than that of the state-of-the-art algorithm.
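To make the 'lingering' idea concrete, below is a minimal Python sketch, not the authors' algorithm. It assumes a finite-sum objective in which each component gradient, once computed, remains usable as long as the iterate stays within a per-component radius of the point where it was computed; the function name lingering_gd, the radii parameter, and the uniform-radius choice in the usage example are all hypothetical illustrations of this assumption.

```python
import numpy as np

def lingering_gd(x0, grads, radii, lr=0.1, T=100):
    """Toy gradient descent with 'lingering' component gradients.

    grads[i](x) -- gradient of the i-th component at x
    radii[i]    -- hypothetical radius within which grads[i] is reused
    Returns the final iterate and the number of component-gradient
    evaluations actually performed.
    """
    n = len(grads)
    x = x0.copy()
    cache = [g(x) for g in grads]          # one full gradient pass at x0
    anchor = [x.copy() for _ in range(n)]  # where each cached gradient was computed
    evals = n                              # oracle calls so far (the full pass)

    for _ in range(T):
        # Recompute only the component gradients whose anchor is too far
        # from the current iterate; the rest "linger" and are reused for free.
        for i in range(n):
            if np.linalg.norm(x - anchor[i]) > radii[i]:
                cache[i] = grads[i](x)
                anchor[i] = x.copy()
                evals += 1
        x -= lr * np.mean(cache, axis=0)
    return x, evals

# Usage: least-squares components f_i(x) = 0.5 * (a_i @ x - b_i)^2,
# with an invented uniform lingering radius of 0.05 for every component.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
grads = [lambda x, a=A[i], bi=b[i]: (a @ x - bi) * a for i in range(50)]
radii = np.full(50, 0.05)
x, evals = lingering_gd(np.zeros(5), grads, radii, lr=0.01, T=200)
print(evals, "component-gradient evaluations instead of", 50 * 201)
```

Under this toy model, the oracle count shrinks because slowly moving iterates stay inside most components' radii; the paper's analysis quantifies this effect when the 'additional time' grows linearly with the traveled distance.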
