The distribution of the Lasso: Uniform control over sparse balls and adaptive parameter tuning

3 November 2018
Léo Miolane
Andrea Montanari
Abstract

The Lasso is a popular regression method for high-dimensional problems in which the number of parameters $\theta_1,\dots,\theta_N$ is larger than the number $n$ of samples: $N>n$. A useful heuristic relates the statistical properties of the Lasso estimator to those of a simple soft-thresholding denoiser in a denoising problem in which the parameters $(\theta_i)_{i\le N}$ are observed in Gaussian noise with a carefully tuned variance. Earlier work confirmed this picture in the limit $n,N\to\infty$, pointwise in the parameters $\theta$ and in the value of the regularization parameter. Here, we consider a standard random design model and prove exponential concentration of the empirical distribution of the Lasso estimator around the prediction provided by the Gaussian denoising model. Crucially, our results are uniform with respect to $\theta$ belonging to $\ell_q$ balls, $q\in[0,1]$, and with respect to the regularization parameter. This allows us to derive sharp results on the performance of various data-driven procedures for tuning the regularization. Our proofs make use of Gaussian comparison inequalities, and in particular of a version of Gordon's minimax theorem developed by Thrampoulidis, Oymak, and Hassibi, which controls the optimal value of the Lasso optimization problem. Crucially, we prove a stability property of the minimizer in Wasserstein distance, which allows us to characterize properties of the minimizer itself.
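To make the soft-thresholding heuristic concrete, here is a minimal Python sketch (not from the paper): it fits the Lasso on a Gaussian random design and, alongside, applies the soft-thresholding denoiser to a noisy observation of $\theta$. The effective noise level `tau` and threshold `zeta` below are placeholder values chosen for illustration; in the paper's analysis they are pinned down by fixed-point equations, uniformly over $\ell_q$ balls and over the regularization parameter.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative sketch, not the paper's construction: compare the Lasso
# estimate on a Gaussian random design with the soft-thresholding
# denoiser applied to a noisy observation of theta.

rng = np.random.default_rng(0)
n, N, s = 500, 1000, 50                   # samples, parameters (N > n), sparsity
theta = np.zeros(N)
theta[:s] = rng.normal(size=s)            # s-sparse signal
X = rng.normal(size=(n, N)) / np.sqrt(n)  # standard random design
sigma = 0.5
y = X @ theta + sigma * rng.normal(size=n)

lam = 0.1                                 # regularization parameter (illustrative)
theta_lasso = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

def soft_threshold(x, t):
    """Soft-thresholding denoiser: eta(x; t) = sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# tau (effective noise level) and zeta (effective threshold) are placeholders
# here; in the paper they are determined by fixed-point equations.
tau, zeta = 0.6, 0.15
theta_denoised = soft_threshold(theta + tau * rng.normal(size=N), zeta)

print("Lasso MSE:   ", np.mean((theta_lasso - theta) ** 2))
print("Denoiser MSE:", np.mean((theta_denoised - theta) ** 2))
```

Once `tau` and `zeta` are properly calibrated, the empirical distributions of the two estimates are expected to be close; the paper's concentration result makes this precise, uniformly in $\theta$ and in the regularization parameter.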
