
Honest variable selection in linear and logistic regression models via $\ell_1$ and $\ell_1+\ell_2$ penalization

Abstract

This paper investigates correct variable selection in finite samples via $\ell_1$ and $\ell_1+\ell_2$ type penalization schemes. The asymptotic consistency of variable selection immediately follows from this analysis. We focus on logistic and linear regression models. The following questions are central to our paper: given a level of confidence $1-\delta$, under which assumptions on the design matrix, for which strength of the signal and for what values of the tuning parameters can we identify the true model at the given level of confidence? Formally, if $\widehat{I}$ is an estimate of the true variable set $I^*$, we study conditions under which $\mathbb{P}(\widehat{I}=I^*)\geq 1-\delta$, for a given sample size $n$, number of parameters $M$ and confidence $1-\delta$. We show that in identifiable models, both methods can recover coefficients of size $\frac{1}{\sqrt{n}}$, up to small multiplicative constants and logarithmic factors in $M$ and $\frac{1}{\delta}$. The advantage of the $\ell_1+\ell_2$ penalization over the $\ell_1$ is minor for the variable selection problem, for the models we consider here. Whereas the former estimates are unique, and become more stable for highly correlated data matrices as one increases the tuning parameter of the $\ell_2$ part, too large an increase in this parameter value may preclude variable selection.
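To make the setup concrete, the following is a minimal sketch (not taken from the paper) of support recovery via $\ell_1$ (lasso) and $\ell_1+\ell_2$ (elastic net) penalization in a linear regression model, using scikit-learn. The design, the signal strength of order $\frac{1}{\sqrt{n}}$, and the tuning parameters `alpha` and `l1_ratio` are illustrative assumptions, not the paper's prescribed choices.

```python
# Illustrative sketch of variable selection: estimate I_hat = {j : beta_hat_j != 0}
# via l1 (lasso) and l1+l2 (elastic net) penalization. Parameter values are
# assumptions for the example, not the tuning rules derived in the paper.
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
n, M = 200, 500                      # sample size n, number of parameters M
I_star = [0, 1, 2]                   # true variable set I*
beta = np.zeros(M)
beta[I_star] = 5.0 / np.sqrt(n)      # coefficients of order 1/sqrt(n)

X = rng.standard_normal((n, M))
y = X @ beta + 0.1 * rng.standard_normal(n)

# l1 penalization (lasso)
lasso = Lasso(alpha=0.05).fit(X, y)
I_hat_l1 = np.flatnonzero(lasso.coef_)

# l1 + l2 penalization (elastic net); increasing the l2 part stabilizes the
# estimate under correlated designs, but, as the paper notes, too large an
# l2 tuning parameter may preclude exact variable selection.
enet = ElasticNet(alpha=0.05, l1_ratio=0.7).fit(X, y)
I_hat_l1l2 = np.flatnonzero(enet.coef_)

print("I_hat (l1)    :", I_hat_l1)
print("I_hat (l1+l2) :", I_hat_l1l2)
```

For the logistic regression case, an analogous sketch would replace the estimators with `LogisticRegression(penalty="l1", solver="liblinear")` or `LogisticRegression(penalty="elasticnet", solver="saga")`; whether the selected set equals $I^*$ with probability at least $1-\delta$ depends on the design, signal strength, and tuning parameters as analyzed in the paper.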
