On the Powerball Method

Claire J. Tomlin
Abstract

We propose a new method to accelerate the convergence of optimization algorithms. This method simply adds a power coefficient $\gamma \in [0,1)$ to the gradient during optimization. We call this the Powerball method, after the well-known Heavy-ball method by Polyak. We analyze the convergence rate of the Powerball method for strongly convex functions and show that it has a faster convergence rate than gradient descent and Newton's method in the initial iterations. We also demonstrate that the Powerball method provides a 10-fold speed-up of the convergence of both gradient descent and L-BFGS on multiple real datasets, and that it accelerates the computation of the PageRank vector.
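For intuition, the update described in the abstract can be sketched as follows: each gradient coordinate $g_i$ is replaced by $\mathrm{sign}(g_i)\,|g_i|^{\gamma}$ before the step, so $\gamma = 1$ recovers plain gradient descent. This is a minimal sketch based only on the abstract's description; the step size, $\gamma$, and iteration count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def powerball_gd(grad, x0, gamma=0.5, lr=0.1, iters=100):
    """Gradient descent with a Powerball-style update.

    Each gradient coordinate g is transformed to sign(g) * |g|**gamma
    before the step, following the abstract's description. The values
    of lr, gamma, and iters are illustrative, not from the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        x = x - lr * np.sign(g) * np.abs(g) ** gamma
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
x_star = powerball_gd(lambda x: A @ x, x0=[5.0, -3.0])
print(x_star)  # close to the minimizer [0, 0]
```

Note that with a fixed step size the power transform leaves a small residual oscillation near the optimum, since the transformed gradient shrinks more slowly than the raw one for $|g| < 1$; a decaying step size would be a natural refinement in practice.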
