Fast Debiasing of the LASSO Estimator

In high-dimensional sparse regression, the \textsc{Lasso} estimator offers excellent theoretical guarantees but is well known to produce biased estimates. To address this, \cite{Javanmard2014} introduced a method to ``debias'' the \textsc{Lasso} estimates for a random sub-Gaussian sensing matrix. Their approach relies on computing an ``approximate inverse'' of the sample covariance matrix by solving a convex optimization problem. This approximate inverse plays a critical role in mitigating bias and in constructing confidence intervals from the debiased \textsc{Lasso} estimates. However, computing it is expensive in practice, as it requires iterative optimization. In this work, we re-parameterize the optimization problem to compute a ``debiasing matrix'' directly, rather than the approximate inverse. This reformulation retains the theoretical guarantees of the debiased \textsc{Lasso} estimates, as they depend on the \emph{product} of the approximate inverse with the sensing matrix rather than on the approximate inverse alone. Notably, we provide a simple, computationally efficient, closed-form solution for the debiasing matrix under conditions on the sensing matrix similar to those of the original debiasing formulation, with the additional condition that every row of the sensing matrix has uncorrelated entries. Moreover, the reformulated optimization problem admits a unique optimal solution, unlike the original formulation. We verify our main result with numerical simulations.
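To make the debiasing step concrete, the following is a minimal numpy sketch of the standard debiased-\textsc{Lasso} correction from \cite{Javanmard2014}: the \textsc{Lasso} estimate is corrected by a term involving an approximate inverse of the sample covariance applied to the residuals. This is an illustration only, not the paper's closed-form construction: for simplicity it uses a well-conditioned setting (more samples than features), where the exact inverse of the sample covariance can stand in for the approximate inverse; the \textsc{Lasso} itself is solved with a basic ISTA loop.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 200, 10, 0.3
theta = np.zeros(p)
theta[:3] = [2.0, -1.5, 1.0]                  # sparse ground truth
X = rng.standard_normal((n, p))                # sensing matrix, i.i.d. entries
y = X @ theta + 0.5 * rng.standard_normal(n)

def lasso_ista(X, y, lam, iters=2000):
    """Minimize (1/2n)||y - X th||^2 + lam ||th||_1 by proximal gradient (ISTA)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    th = np.zeros(p)
    for _ in range(iters):
        z = th - (X.T @ (X @ th - y) / n) / L  # gradient step
        th = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return th

th_lasso = lasso_ista(X, y, lam)

# Debiasing step: th_d = th_lasso + (1/n) * M @ X.T @ (y - X @ th_lasso).
# With n > p we can take M to be the exact inverse of the sample covariance;
# in the high-dimensional regime (p > n) M -- or, in the reformulation above,
# the debiasing matrix directly -- must instead come from the convex program
# or the closed-form solution discussed in the abstract.
Sigma_hat = X.T @ X / n
M = np.linalg.inv(Sigma_hat)
th_debiased = th_lasso + M @ X.T @ (y - X @ th_lasso) / n

print("Lasso error:   ", np.linalg.norm(th_lasso - theta))    # shrunk toward zero
print("Debiased error:", np.linalg.norm(th_debiased - theta))  # bias removed
```

In this low-dimensional sanity check the correction recovers the least-squares solution exactly, so the debiased estimate has visibly smaller error than the shrunken \textsc{Lasso} estimate; the interesting regime of the paper is the high-dimensional one, where only an approximate (or reformulated) debiasing matrix is available.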
@article{banerjee2025_2502.19825,
  title={Fast Debiasing of the LASSO Estimator},
  author={Shuvayan Banerjee and James Saunderson and Radhendushka Srivastava and Ajit Rajwade},
  journal={arXiv preprint arXiv:2502.19825},
  year={2025}
}