On Coresets For Regularized Regression

Abstract

We study the effect of norm-based regularization on the size of coresets for regularized regression problems. Specifically, given a matrix $\mathbf{A} \in \mathbb{R}^{n \times d}$ with $n \gg d$, a vector $\mathbf{b} \in \mathbb{R}^n$, and $\lambda > 0$, we analyze the size of coresets for regularized versions of regression of the form $\|\mathbf{Ax}-\mathbf{b}\|_p^r + \lambda\|\mathbf{x}\|_q^s$. It has been shown for the case of ridge regression ($p,q,r,s=2$) that one can obtain a coreset smaller than the coreset for its unregularized counterpart, i.e., least squares regression (Avron et al.). We show that when $r \neq s$, no coreset for regularized regression can have size smaller than the optimal coreset of the unregularized version. The well-known lasso problem falls under this category and hence does not admit a coreset smaller than the one for least squares regression. We propose a modified version of the lasso problem and obtain for it a coreset smaller than the one for least squares regression. We empirically show that the modified version of the lasso also induces sparsity in the solution, like the lasso. We also obtain smaller coresets for $\ell_p$ regression with $\ell_p$ regularization. We extend our methods to multi-response regularized regression. Finally, we empirically demonstrate the coreset performance for the modified lasso and for $\ell_1$ regression with $\ell_1$ regularization.
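For concreteness, here is a minimal NumPy sketch of the objective family the abstract describes; the function name, parameter defaults, and data are our own illustration, not code from the paper:

```python
import numpy as np

def reg_regression_objective(A, b, x, p=2, r=2, q=1, s=1, lam=0.1):
    """Evaluate ||A x - b||_p^r + lam * ||x||_q^s, the family studied above."""
    fit = np.linalg.norm(A @ x - b, ord=p) ** r      # data-fitting term
    penalty = lam * np.linalg.norm(x, ord=q) ** s    # regularization term
    return fit + penalty

# Lasso corresponds to p = r = 2, q = s = 1 (so r != s);
# ridge regression corresponds to p = q = r = s = 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 10))  # n >> d, as in the abstract
b = rng.standard_normal(1000)
x = rng.standard_normal(10)
print(reg_regression_objective(A, b, x))            # lasso objective
print(reg_regression_objective(A, b, x, q=2, s=2))  # ridge objective
```

The abstract's dichotomy is stated in these parameters: ridge ($r = s$) admits a coreset smaller than the unregularized problem's, while lasso ($r \neq s$) does not.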
