Heavy Lasso: sparse penalized regression under heavy-tailed noise via data-augmented soft-thresholding

Main text: 16 pages
Bibliography: 2 pages
1 figure
5 tables
Abstract

High-dimensional linear regression is a fundamental tool in modern statistics, particularly when the number of predictors exceeds the sample size. The classical Lasso, which relies on the squared loss, performs well under Gaussian noise assumptions but often deteriorates in the presence of heavy-tailed errors or outliers commonly encountered in real data applications such as genomics, finance, and signal processing. To address these challenges, we propose a novel robust regression method, termed Heavy Lasso, which incorporates a loss function inspired by the Student's t-distribution within a Lasso penalization framework. This loss retains the desirable quadratic behavior for small residuals while adaptively downweighting large deviations, thus enhancing robustness to heavy-tailed noise and outliers. Heavy Lasso is computationally efficient, leveraging a data augmentation scheme and a soft-thresholding algorithm that integrate seamlessly with classical Lasso solvers. Theoretically, we establish non-asymptotic error bounds in both the $\ell_1$ and $\ell_2$ norms using a localized convexity framework, showing that the Heavy Lasso estimator achieves rates comparable to those obtained with the Huber loss. Extensive numerical studies demonstrate Heavy Lasso's superior performance over the classical Lasso and other robust variants, highlighting its effectiveness in challenging noisy settings. Our method is implemented in the R package heavylasso, available on GitHub.
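To make the data-augmentation idea concrete, the R sketch below (ours, not the heavylasso package's implementation) assumes the standard Gaussian scale-mixture representation of the Student-t loss $\rho_\nu(r) = \frac{\nu+1}{2}\log(1 + r^2/\nu)$: an E-step assigns each observation the weight $w_i = (\nu+1)/(\nu + r_i^2)$, which downweights large residuals, and an M-step solves the resulting weighted Lasso by coordinate-descent soft-thresholding. All names here (heavy_lasso_sketch, nu, lambda) are illustrative, not taken from the paper.

## Minimal sketch of EM-style data augmentation for a Student-t loss
## with an l1 penalty; not the authors' code.

soft_threshold <- function(z, gamma) sign(z) * pmax(abs(z) - gamma, 0)

heavy_lasso_sketch <- function(X, y, lambda, nu = 3, n_iter = 50, tol = 1e-6) {
  n <- nrow(X); p <- ncol(X)
  beta <- rep(0, p)
  r <- as.vector(y - X %*% beta)
  for (it in seq_len(n_iter)) {
    beta_old <- beta
    w <- (nu + 1) / (nu + r^2)        # E-step: outliers get small weights
    for (j in seq_len(p)) {           # M-step: one coordinate-descent pass
      r_j <- r + X[, j] * beta[j]     # partial residual excluding x_j
      beta[j] <- soft_threshold(sum(w * X[, j] * r_j) / n, lambda) /
        (sum(w * X[, j]^2) / n)
      r <- r_j - X[, j] * beta[j]
    }
    if (max(abs(beta - beta_old)) < tol) break
  }
  beta
}

## Toy usage on simulated data with heavy-tailed t(2) noise:
set.seed(1)
n <- 100; p <- 200
X <- matrix(rnorm(n * p), n, p)
beta_true <- c(rep(2, 5), rep(0, p - 5))
y <- as.vector(X %*% beta_true + rt(n, df = 2))
beta_hat <- heavy_lasso_sketch(X, y, lambda = 0.2)

Because each M-step is an ordinary weighted Lasso, this scheme can reuse any existing coordinate-descent Lasso solver, which is the "integrates seamlessly with classical Lasso solvers" point made in the abstract.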

@article{mai2025_2506.07790,
  title={Heavy Lasso: sparse penalized regression under heavy-tailed noise via data-augmented soft-thresholding},
  author={Tien Mai},
  journal={arXiv preprint arXiv:2506.07790},
  year={2025}
}