Fast Debiasing of the LASSO Estimator

Abstract

In high-dimensional sparse regression, the \textsc{Lasso} estimator offers excellent theoretical guarantees but is well known to produce biased estimates. To address this, \cite{Javanmard2014} introduced a method to ``debias" the \textsc{Lasso} estimates for a random sub-Gaussian sensing matrix $\boldsymbol{A}$. Their approach relies on computing an ``approximate inverse" $\boldsymbol{M}$ of the matrix $\boldsymbol{A}^\top \boldsymbol{A}/n$ by solving a convex optimization problem. This matrix $\boldsymbol{M}$ plays a critical role in mitigating bias and in constructing confidence intervals from the debiased \textsc{Lasso} estimates. However, the computation of $\boldsymbol{M}$ is expensive in practice, as it requires iterative optimization. In this work, we re-parameterize the optimization problem to compute a ``debiasing matrix" $\boldsymbol{W} := \boldsymbol{A}\boldsymbol{M}^{\top}$ directly, rather than the approximate inverse $\boldsymbol{M}$. This reformulation retains the theoretical guarantees of the debiased \textsc{Lasso} estimates, as they depend on the \emph{product} $\boldsymbol{A}\boldsymbol{M}^{\top}$ rather than on $\boldsymbol{M}$ alone. Notably, we provide a simple, computationally efficient, closed-form solution for $\boldsymbol{W}$ under conditions on the sensing matrix $\boldsymbol{A}$ similar to those in the original debiasing formulation, with the additional condition that every row of $\boldsymbol{A}$ has uncorrelated entries. Moreover, the optimization problem based on $\boldsymbol{W}$ guarantees a unique optimal solution, unlike the original formulation based on $\boldsymbol{M}$. We verify our main result with numerical simulations.
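The debiasing correction of \cite{Javanmard2014} takes the generic form $\hat{\boldsymbol{\beta}}^d = \hat{\boldsymbol{\beta}} + \frac{1}{n}\boldsymbol{M}\boldsymbol{A}^\top(\boldsymbol{y} - \boldsymbol{A}\hat{\boldsymbol{\beta}})$, which, since $\boldsymbol{W}^\top = (\boldsymbol{A}\boldsymbol{M}^\top)^\top = \boldsymbol{M}\boldsymbol{A}^\top$, can be written entirely in terms of the product $\boldsymbol{W}$. The sketch below is illustrative only (it is not the paper's algorithm): it uses a low-dimensional setting with $n > p$ so that the exact inverse of the sample covariance exists, and a soft-thresholded OLS estimate as a hypothetical stand-in for a \textsc{Lasso} fit. It shows that only $\boldsymbol{W}^\top$ enters the correction, and that with the exact inverse the correction recovers OLS identically.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10  # n > p here so the sample covariance is invertible (illustration only)
A = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]   # sparse ground truth
y = A @ beta_true + 0.1 * rng.standard_normal(n)

# Hypothetical stand-in for a Lasso fit: soft-thresholded OLS.
ols = np.linalg.solve(A.T @ A, A.T @ y)
beta_hat = np.sign(ols) * np.maximum(np.abs(ols) - 0.5, 0.0)

# Debiasing: beta_d = beta_hat + (1/n) * M @ A.T @ residual,
# equivalently (1/n) * W.T @ residual with W := A @ M.T.
M = np.linalg.inv(A.T @ A / n)     # exact inverse, available since n > p
W = A @ M.T
beta_d = beta_hat + (W.T @ (y - A @ beta_hat)) / n

# With the exact inverse, the debiased estimate coincides with OLS:
# beta_hat + (A^T A)^{-1} A^T (y - A beta_hat) = (A^T A)^{-1} A^T y.
print(np.allclose(beta_d, ols))  # True
```

In the genuinely high-dimensional regime ($p > n$) the exact inverse does not exist, which is precisely why $\boldsymbol{M}$ (or, in this work, $\boldsymbol{W}$ directly) must instead be obtained from an approximate-inverse criterion.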

@article{banerjee2025_2502.19825,
  title={Fast Debiasing of the LASSO Estimator},
  author={Shuvayan Banerjee and James Saunderson and Radhendushka Srivastava and Ajit Rajwade},
  journal={arXiv preprint arXiv:2502.19825},
  year={2025}
}