DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression

16 April 2016
Parameswaran Raman, Sriram Srinivasan, Shin Matsushima, Xinhua Zhang, Hyokun Yun
arXiv:1604.04706
Abstract

Multinomial logistic regression is a popular tool in the arsenal of machine learning algorithms, yet scaling it to datasets with a very large number of data points and classes is non-trivial. This is primarily because the log-partition function must be computed for every data point, which makes distributing the computation hard. In this paper, we present a distributed stochastic gradient descent-based optimization method (DS-MLR) for scaling multinomial logistic regression to very large data. Our algorithm exploits double separability, an attractive property we observe in the objective functions of several machine learning models, which allows us to distribute both the data and the model parameters across multiple machines simultaneously. In addition to being easily parallelizable, our algorithm reaches good test accuracy quickly, with a low overall time and memory footprint, as demonstrated by empirical results in both single- and multi-machine settings. For instance, on a dataset with 93,805 training instances and 12,294 classes, we achieve close to the optimal F-score in 10,000 seconds using two machines with 12 cores each.
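
To make the abstract's central idea concrete, the sketch below illustrates one way a doubly separable update for multinomial logistic regression can look. It is a hedged reconstruction under assumptions, not the authors' DS-MLR implementation: it uses the standard variational bound log Z = min_{b>0} (Z/b + log b - 1) to replace each example's log-partition with a per-example auxiliary variable b_i, after which the objective becomes a double sum over (example i, class k) terms, each touching only one data row x_i and one class weight row w_k. The variable names and the stochastic refresh rule for b_i are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative sketch of a doubly separable MLR update (assumed form,
# not the authors' DS-MLR code). The standard per-example loss,
#   -w[y_i].x_i + log(sum_k exp(w_k . x_i)),
# couples ALL class weights through the log-partition. Rewriting the
# log-partition with the bound log Z = min_{b>0} (Z/b + log b - 1)
# yields per-(i, k) terms exp(w_k . x_i)/b_i, so one stochastic step
# reads and writes only W[k] and b[i].

rng = np.random.default_rng(0)
N, D, K = 200, 20, 30                  # examples, features, classes (toy sizes)
X = rng.normal(size=(N, D))
y = rng.integers(K, size=N)
W = np.zeros((K, D))                   # model: one weight row per class
b = np.ones(N)                         # auxiliary variable per example

def update(i, k, lr=0.05):
    """Stochastic step on the (example i, class k) term."""
    s = np.exp(np.clip(W[k] @ X[i], -30.0, 30.0))  # exp(w_k . x_i), clipped for safety
    grad = (s / b[i]) * X[i]           # d/dw_k of exp(w_k . x_i)/b_i
    if k == y[i]:
        grad -= X[i]                   # gradient of the -w[y_i].x_i term
    W[k] -= lr * grad
    # The optimal b_i equals the partition sum Z_i; K*s is an unbiased
    # one-class estimate of Z_i when k is drawn uniformly, tracked here
    # with a moving average (an illustrative choice).
    b[i] = max(1e-8, 0.9 * b[i] + 0.1 * K * s)

for _ in range(20_000):
    update(rng.integers(N), rng.integers(K))
```

Because each update touches only W[k] and b[i], disjoint blocks of examples and classes can be processed on different machines at the same time, which is the structural property that lets data and model parameters be partitioned simultaneously.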
