DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression
16 April 2016
Parameswaran Raman, Sriram Srinivasan, Shin Matsushima, Xinhua Zhang, Hyokun Yun, S.V.N. Vishwanathan
arXiv:1604.04706 (abs · PDF · HTML)
Papers citing "DS-MLR: Exploiting Double Separability for Scaling up Distributed Multinomial Logistic Regression" (4 of 4 papers shown)
Title | Authors | Citations | Date
Ranking via Robust Binary Classification and Parallel Parameter Estimation in Large-Scale Data | Hyokun Yun, Parameswaran Raman, S.V.N. Vishwanathan | 28 | 11 Feb 2014
ADADELTA: An Adaptive Learning Rate Method | Matthew D. Zeiler | 6,635 | 22 Dec 2012
Practical recommendations for gradient-based training of deep architectures | Yoshua Bengio | 2,203 | 24 Jun 2012
HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent | Feng Niu, Benjamin Recht, Christopher Ré, Stephen J. Wright | 2,273 | 28 Jun 2011