ResearchTrend.AI
Asymptotic Analysis of LASSO's Solution Path with Implications for Approximate Message Passing

23 September 2013
Ali Mousavi
A. Maleki
Richard G. Baraniuk
Abstract

This paper concerns the performance of the LASSO (also known as basis pursuit denoising) for recovering sparse signals from undersampled, randomized, noisy measurements. We consider the recovery of the signal $x_o \in \mathbb{R}^N$ from $n$ random and noisy linear observations $y = Ax_o + w$, where $A$ is the measurement matrix and $w$ is the noise. The LASSO estimate of $x_o$ is given by the solution to the optimization problem $\hat{x}_{\lambda} = \arg\min_x \frac{1}{2}\|y - Ax\|_2^2 + \lambda\|x\|_1$. Despite major progress in the theoretical analysis of the LASSO solution, little is known about its behavior as a function of the regularization parameter $\lambda$. In this paper we study two questions in the asymptotic setting (i.e., where $N \rightarrow \infty$ and $n \rightarrow \infty$ while the ratio $n/N$ converges to a fixed number in $(0,1)$): (i) How does the size of the active set $\|\hat{x}_\lambda\|_0 / N$ behave as a function of $\lambda$? (ii) How does the mean square error $\|\hat{x}_\lambda - x_o\|_2^2 / N$ behave as a function of $\lambda$? We then employ these results in a new, reliable algorithm for solving LASSO based on approximate message passing (AMP).
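To make the setting concrete, the following is a minimal NumPy sketch of the basic AMP iteration for the LASSO: soft thresholding of the effective observation $x^t + A^T z^t$, plus the Onsager correction term in the residual update. The threshold rule used here (a multiple `alpha` of the estimated noise level $\|z^t\|_2/\sqrt{n}$) is a common heuristic, not the paper's calibration between the AMP threshold and $\lambda$; all names and parameter choices below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, theta):
    """Entrywise soft thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, alpha=2.0, n_iter=30):
    """Basic AMP iteration for sparse recovery (illustrative sketch).

    alpha scales the threshold relative to the estimated per-iteration
    noise level; this simple rule stands in for the tuning the paper studies.
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(n)      # estimate of the effective noise level
        x = soft_threshold(x + A.T @ z, alpha * tau)
        b = np.count_nonzero(x) / n               # Onsager correction weight
        z = y - A @ x + b * z                     # residual with Onsager term
    return x

# Illustrative usage on a synthetic sparse-recovery instance
rng = np.random.default_rng(0)
n, N, k = 128, 256, 8
A = rng.standard_normal((n, N)) / np.sqrt(n)      # normalized random measurement matrix
x_o = np.zeros(N)
x_o[rng.choice(N, size=k, replace=False)] = 1.0   # k-sparse signal
y = A @ x_o + 0.01 * rng.standard_normal(n)       # noisy undersampled observations
x_hat = amp(y, A)
```

Without the `b * z` Onsager term this reduces to iterative soft thresholding (ISTA); the correction is what makes the effective observation behave like the true signal plus Gaussian noise in the asymptotic regime the abstract describes.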
