We consider the problem of estimating an unknown signal $x_0$ from noisy linear observations $y = Ax_0 + z \in \mathbb{R}^m$. In many practical instances, $x_0$ has a certain structure that can be captured by a structure-inducing convex function $f(\cdot)$. For example, the $\ell_1$ norm can be used to encourage a sparse solution. To estimate $x_0$ with the aid of $f(\cdot)$, we consider the well-known LASSO method and provide a sharp characterization of its performance. We assume the entries of the measurement matrix $A$ and the noise vector $z$ have zero-mean normal distributions with variances $1$ and $\sigma^2$, respectively. For the LASSO estimator $x^*$, we attempt to calculate the Normalized Square Error (NSE) defined as $\frac{\|x^* - x_0\|_2^2}{\sigma^2}$ as a function of the noise level $\sigma$, the number of observations $m$ and the structure of the signal. We show that the structure of the signal $x_0$ and the choice of the function $f(\cdot)$ enter the error formulae through the summary parameters $D(\text{cone})$ and $D(\lambda)$, which are defined as the Gaussian squared-distances to the subdifferential cone and to the $\lambda$-scaled subdifferential, respectively. The first LASSO estimator assumes a priori knowledge of $f(x_0)$ and is given by $\arg\min_{x}\{\|y - Ax\|_2 ~\text{subject to}~ f(x) \leq f(x_0)\}$. We prove that its worst-case NSE is achieved when $\sigma \rightarrow 0$ and concentrates around $\frac{D(\text{cone})}{m - D(\text{cone})}$. Secondly, we consider $\arg\min_{x}\{\|y - Ax\|_2 + \lambda f(x)\}$, for some $\lambda \geq 0$. This time the NSE formula depends on the choice of $\lambda$ and is given by $\frac{D(\lambda)}{m - D(\lambda)}$. We then establish a mapping between this and the third estimator $\arg\min_{x}\{\frac{1}{2}\|y - Ax\|_2^2 + \lambda\sigma f(x)\}$. Finally, for a number of important structured signal classes, we translate our abstract formulae to closed-form upper bounds on the NSE.
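As an illustrative complement, the sketch below shows how the second, $\ell_2$-penalized estimator $\arg\min_{x}\{\|y - Ax\|_2 + \lambda f(x)\}$ with $f = \ell_1$ norm could be solved on synthetic Gaussian data and its empirical NSE measured. It is a minimal sketch assuming the cvxpy solver; the dimensions, sparsity level, and value of $\lambda$ are arbitrary illustrative choices, not values prescribed by the paper.

```python
import numpy as np
import cvxpy as cp

np.random.seed(0)
n, m, k = 200, 100, 10     # signal dimension, measurements, sparsity (illustrative)
sigma = 0.1                # noise standard deviation
lam = 1.5                  # penalty weight lambda (illustrative choice)

# k-sparse ground truth x0 and Gaussian observations y = A x0 + z
x0 = np.zeros(n)
x0[np.random.choice(n, k, replace=False)] = np.random.randn(k)
A = np.random.randn(m, n)          # entries ~ N(0, 1)
z = sigma * np.random.randn(m)     # entries ~ N(0, sigma^2)
y = A @ x0 + z

# Second estimator: argmin_x ||y - Ax||_2 + lambda * ||x||_1
x = cp.Variable(n)
objective = cp.Minimize(cp.norm(y - A @ x, 2) + lam * cp.norm(x, 1))
cp.Problem(objective).solve()

# Normalized Square Error: ||x* - x0||_2^2 / sigma^2
nse = np.linalg.norm(x.value - x0) ** 2 / sigma ** 2
print(f"empirical NSE: {nse:.3f}")
```

With this setup, the empirical NSE from repeated trials could be compared against the predicted ratio $\frac{D(\lambda)}{m - D(\lambda)}$, where $D(\lambda)$ is the Gaussian squared-distance to the $\lambda$-scaled subdifferential of the $\ell_1$ norm at $x_0$.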