
DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression

Abstract

Multinomial logistic regression is a popular tool in the arsenal of machine learning algorithms, yet scaling it to datasets with a very large number of data points and classes has not been trivial. This is primarily because the log-partition function, which couples all the class parameters, must be computed for every data point, making the computation hard to distribute. In this paper, we present a distributed stochastic gradient descent based optimization method (DS-MLR) for scaling multinomial logistic regression to very large data. Our algorithm exploits double separability, an attractive property we observe in the objective functions of several machine learning models, which allows us to achieve both data parallelism and model parallelism simultaneously. In addition to being parallelizable, our algorithm can also easily be made asynchronous. To demonstrate the effectiveness of our method, we solve a very large multi-class classification problem on the reddit dataset, with data and parameter sizes of 200 GB and 300 GB respectively. Data at this scale calls for simultaneous data and model parallelism, which is where DS-MLR fits in.
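For context, a standard form of the multinomial logistic regression objective over N data points (x_i, y_i) and K classes with weight vectors w_1, ..., w_K is (generic notation, not necessarily the paper's):

    L(W) = -\frac{1}{N} \sum_{i=1}^{N} \left[ w_{y_i}^{\top} x_i - \log \sum_{k=1}^{K} \exp\!\left( w_k^{\top} x_i \right) \right]

The log-sum-exp (log-partition) term ties every class weight w_k to every data point x_i, so neither partitioning the data across workers nor partitioning the classes across workers decouples the computation on its own. Roughly speaking, double separability means the objective can be rewritten as a double sum \sum_i \sum_k f_{ik}(w_k, u_i), where each term touches only one per-class block w_k and one per-point block u_i, so the data dimension and the model dimension can be partitioned across workers independently.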
