DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression

16 April 2016 · v7 (latest)
Parameswaran Raman, Sriram Srinivasan, Shin Matsushima, Xinhua Zhang, Hyokun Yun, S.V.N. Vishwanathan
Links: ArXiv (abs) · PDF · HTML
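
For context on the title (a sketch in standard notation, not taken from the paper's text): multinomial logistic regression over N examples (x_i, y_i) and K classes minimizes

```latex
\min_{W=\{w_1,\dots,w_K\}} \; \sum_{i=1}^{N}\left[ -\,w_{y_i}^{\top}x_i \;+\; \log\sum_{k=1}^{K}\exp\!\left(w_k^{\top}x_i\right) \right]
```

The log-sum-exp term couples every class for each example, so the objective separates over examples but not over classes. The "double separability" of the title refers to reformulating the objective so that it decomposes over both the N examples and the K classes, which is what permits distributing data and model parameters simultaneously.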

Papers citing "DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression"

4 papers

| Title | Authors | Communities | Citations | Date |
|---|---|---|---|---|
| Ranking via Robust Binary Classification and Parallel Parameter Estimation in Large-Scale Data | Hyokun Yun, Parameswaran Raman, S.V.N. Vishwanathan | — | 28 | 11 Feb 2014 |
| ADADELTA: An Adaptive Learning Rate Method | Matthew D. Zeiler | ODL | 6,635 | 22 Dec 2012 |
| Practical recommendations for gradient-based training of deep architectures | Yoshua Bengio | 3DH, ODL | 2,203 | 24 Jun 2012 |
| HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent | Feng Niu, Benjamin Recht, Christopher Ré, Stephen J. Wright | — | 2,273 | 28 Jun 2011 |
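
Among the works listed above, ADADELTA (Zeiler, 2012) is defined by a compact per-parameter update rule. A minimal NumPy sketch of that rule, assuming the paper's suggested defaults ρ = 0.95 and ε = 1e-6 (the function name and toy objective below are illustrative, not from the paper):

```python
import numpy as np

def adadelta_step(grad, state, rho=0.95, eps=1e-6):
    # Accumulate a decaying average of squared gradients: E[g^2].
    state["Eg2"] = rho * state["Eg2"] + (1 - rho) * grad ** 2
    # Scale the gradient by RMS of past updates over RMS of gradients,
    # so no global learning rate is required.
    dx = -np.sqrt(state["Edx2"] + eps) / np.sqrt(state["Eg2"] + eps) * grad
    # Accumulate a decaying average of squared updates: E[dx^2].
    state["Edx2"] = rho * state["Edx2"] + (1 - rho) * dx ** 2
    return dx

# Usage: minimize f(w) = ||w||^2 from a random start.
w = np.random.randn(5)
state = {"Eg2": np.zeros_like(w), "Edx2": np.zeros_like(w)}
for _ in range(500):
    w += adadelta_step(2 * w, state)  # gradient of ||w||^2 is 2w
print(np.linalg.norm(w))  # the norm should have shrunk substantially
```

Because the step size adapts per dimension from accumulated statistics, the same code runs unchanged across parameter scales, which is the method's main appeal.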