Communication-Efficient Distributed Optimization using an Approximate Newton-type Method

International Conference on Machine Learning (ICML), 2014
Abstract

We present a novel Newton-type method for distributed optimization, which converges to the empirical optimum of distributed stochastic optimization problems using a small number of simple communication rounds. For quadratic objectives, the number of communication rounds provably scales *down* with the data size, and is essentially a constant under reasonable assumptions. We also present a looser analysis for the non-quadratic case, and discuss the advantages of our approach compared to a recent single-round-of-communication algorithm.
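To illustrate the style of method the abstract describes, here is a minimal NumPy sketch of a DANE-style round on quadratic objectives: each machine averages gradients (one communication round), takes an approximate Newton step using only its *local* Hessian, and the resulting iterates are averaged. All variable names, the step size, and the regularizer are illustrative assumptions, not the paper's exact presentation.

```python
import numpy as np

# Sketch: m machines, each holding a local quadratic
# phi_i(w) = 0.5 * w' A_i w - b_i' w.  The A_i are drawn from the same
# distribution, so local Hessians approximate the global one -- the
# regime in which the analysis gives fast convergence.
rng = np.random.default_rng(0)
d, m = 5, 4
A, b = [], []
for _ in range(m):
    M = rng.standard_normal((d, d))
    A.append(np.eye(d) + 0.1 * M @ M.T / d)  # symmetric positive definite
    b.append(rng.standard_normal(d))
A_bar = sum(A) / m
b_bar = sum(b) / m
w_star = np.linalg.solve(A_bar, b_bar)  # empirical optimum of the average objective

mu, eta = 0.1, 1.0   # local regularizer and step size (tunable assumptions)
w = np.zeros(d)
for _ in range(20):  # each iteration costs one round of simple averaging
    # communication: average the local gradients
    g = sum(Ai @ w - bi for Ai, bi in zip(A, b)) / m
    # local approximate-Newton steps using each machine's own Hessian,
    # then average the iterates
    w = sum(w - eta * np.linalg.solve(Ai + mu * np.eye(d), g) for Ai in A) / m

print(np.linalg.norm(w - w_star))  # distance to the empirical optimum
```

The key point the sketch captures is that no machine ever ships its Hessian: only d-dimensional gradients and iterates are communicated, yet the local-Hessian steps contract the error far faster than plain gradient averaging when the local data distributions are similar.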
