
Small Tuning Parameter Selection for the Debiased Lasso

Abstract

In this study, we investigate the bias and variance properties of the debiased Lasso in linear regression when the tuning parameter of the node-wise Lasso is selected to be smaller than in previous studies. We consider the case where the number of covariates p is bounded by a constant multiple of the sample size n. First, we show that the bias of the debiased Lasso can be reduced without the asymptotic variance diverging by setting the order of the tuning parameter to 1/\sqrt{n}. This implies that the debiased Lasso is asymptotically normal provided that the number of nonzero coefficients s_0 satisfies s_0 = o(\sqrt{n/\log p}), whereas previous studies require s_0 = o(\sqrt{n}/\log p) if no sparsity assumption is imposed on the precision matrix. Second, we propose a data-driven tuning parameter selection procedure for the node-wise Lasso that is consistent with our theoretical results. Simulation studies show that our procedure yields confidence intervals with good coverage properties in various settings. We also present a real economic data example to demonstrate the efficacy of our selection procedure.
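To illustrate the mechanics behind the abstract, here is a minimal numerical sketch of the debiased Lasso with a node-wise tuning parameter of order 1/\sqrt{n}. All function names, the simple coordinate-descent solver, and the constant multiplying 1/\sqrt{n} are our own illustrative choices; this is not the paper's data-driven selection procedure.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via coordinate descent: minimizes ||y - X b||^2 / (2n) + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()  # running residual y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]  # remove coordinate j from the current fit
            rho = X[:, j] @ r / n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]  # add the updated coordinate back
    return beta

def nodewise_precision(X, lam):
    """Approximate inverse covariance Theta_hat via node-wise Lasso regressions."""
    n, p = X.shape
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = np.delete(np.arange(p), j)
        gamma = lasso_cd(X[:, idx], X[:, j], lam)   # regress X_j on the other columns
        resid = X[:, j] - X[:, idx] @ gamma
        tau2 = X[:, j] @ resid / n                  # tau_j^2 normalization
        Theta[j, j] = 1.0
        Theta[j, idx] = -gamma
        Theta[j] /= tau2
    return Theta

def debiased_lasso(X, y, lam_beta, lam_node):
    """Debiased Lasso: b_hat = beta_hat + Theta_hat X^T (y - X beta_hat) / n."""
    n, _ = X.shape
    beta = lasso_cd(X, y, lam_beta)
    Theta = nodewise_precision(X, lam_node)
    return beta + Theta @ X.T @ (y - X @ beta) / n, beta
```

In this sketch one would set `lam_node = c / np.sqrt(n)` for some constant c, mirroring the 1/\sqrt{n} order discussed above; choosing that constant in a data-driven way is the contribution of the selection procedure described in the paper.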
