Asymptotically Exact Error Analysis for the Generalized $\ell_2^2$-LASSO

Given an unknown signal $\mathbf{x}_0\in\mathbb{R}^n$ and linear noisy measurements $\mathbf{y}=\mathbf{A}\mathbf{x}_0+\sigma\mathbf{v}\in\mathbb{R}^m$, the generalized $\ell_2^2$-LASSO solves $\hat{\mathbf{x}}:=\arg\min_{\mathbf{x}}\tfrac{1}{2}\|\mathbf{y}-\mathbf{A}\mathbf{x}\|_2^2+\sigma\lambda f(\mathbf{x})$. Here, $f$ is a convex regularization function (e.g. $\ell_1$-norm, nuclear norm) aiming to promote the structure of $\mathbf{x}_0$ (e.g. sparse, low-rank), and $\lambda\geq 0$ is the regularizer parameter. A related optimization problem, though not as popular or well-known, is often referred to as the generalized $\ell_2$-LASSO; it takes the form $\hat{\mathbf{x}}:=\arg\min_{\mathbf{x}}\|\mathbf{y}-\mathbf{A}\mathbf{x}\|_2+\lambda f(\mathbf{x})$ and has been analyzed in [1]. [1] further made conjectures about the performance of the generalized $\ell_2^2$-LASSO. This paper establishes these conjectures rigorously. We measure performance with the normalized squared error $\mathrm{NSE}(\sigma):=\|\hat{\mathbf{x}}-\mathbf{x}_0\|_2^2/\sigma^2$. Assuming the entries of $\mathbf{A}$ and $\mathbf{v}$ are i.i.d. standard normal, we precisely characterize the "asymptotic NSE" $\mathrm{aNSE}:=\lim_{\sigma\to 0}\mathrm{NSE}(\sigma)$ when the problem dimensions $m,n$ tend to infinity in a proportional manner. The role of $\lambda$, $f$, and $\mathbf{x}_0$ is explicitly captured in the derived expression by means of a single geometric quantity, the Gaussian distance to the subdifferential. We conjecture that $\mathrm{aNSE}=\sup_{\sigma>0}\mathrm{NSE}(\sigma)$. We include detailed discussions on the interpretation of our result, make connections to relevant literature, and perform computational experiments that validate our theoretical findings.
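As a rough numerical illustration of the quantities defined above (not the authors' experiments), the following Python sketch draws $\mathbf{A}$ and $\mathbf{v}$ with i.i.d. standard normal entries, solves the generalized $\ell_2^2$-LASSO for the special case $f=\|\cdot\|_1$ with a plain ISTA loop, and reports the empirical $\mathrm{NSE}(\sigma)$. The dimensions, sparsity level, noise level `sigma`, regularizer parameter `lam`, and the choice of solver are all illustrative assumptions, not values or methods taken from the paper.

```python
# Illustrative sketch: empirical NSE of the generalized ell_2^2-LASSO with an
# ell_1 regularizer, solved by ISTA. All problem sizes and parameters below
# are hypothetical choices for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 200, 120, 10      # ambient dimension, measurements, sparsity (assumed)
sigma, lam = 0.05, 1.5      # noise level and regularizer parameter (assumed)

# Unknown k-sparse signal and i.i.d. standard normal A, v, as in the measurement model
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
v = rng.standard_normal(m)
y = A @ x0 + sigma * v

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# ISTA for: min_x 0.5 * ||y - A x||_2^2 + sigma * lam * ||x||_1
L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the smooth part
x = np.zeros(n)
for _ in range(5000):
    grad = A.T @ (A @ x - y)            # gradient of 0.5 * ||y - A x||_2^2
    x = soft_threshold(x - grad / L, sigma * lam / L)

nse = np.sum((x - x0) ** 2) / sigma ** 2
print(f"empirical NSE(sigma={sigma}) = {nse:.3f}")
```

Repeating such a run over several small values of `sigma` gives a crude empirical proxy for the asymptotic NSE that the paper characterizes exactly.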