
Robust Lasso with missing and grossly corrupted observations

Abstract

This paper studies the problem of accurately recovering a sparse vector $\beta^{\star}$ from highly corrupted linear measurements $y = X \beta^{\star} + e^{\star} + w$, where $e^{\star}$ is a sparse error vector whose nonzero entries may be unbounded and $w$ is bounded noise. We propose a so-called extended Lasso optimization that exploits the sparsity of both $\beta^{\star}$ and $e^{\star}$. Our first result shows that the extended Lasso can faithfully recover both the regression vector and the corruption vector. Our analysis relies on a notion of extended restricted eigenvalue for the design matrix $X$. Our second set of results applies to a general class of Gaussian design matrices $X$ with i.i.d. rows $\mathcal{N}(0, \Sigma)$, for which we establish a surprising result: the extended Lasso can recover the exact signed supports of both $\beta^{\star}$ and $e^{\star}$ from only $\Omega(k \log p \log n)$ observations, even when the fraction of corrupted observations is arbitrarily close to one. Our analysis also shows that this number of observations is optimal for exact signed support recovery.
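To make the setup concrete, the convex program below is a minimal sketch of an extended Lasso of the form $\min_{\beta, e} \tfrac{1}{2}\|y - X\beta - e\|_2^2 + \lambda_\beta \|\beta\|_1 + \lambda_e \|e\|_1$, solved by plain ISTA (proximal gradient) on the augmented design $[X \mid I]$. The solver, the toy data, and all parameter values (`lam_beta`, `lam_e`, iteration count) are illustrative assumptions, not the paper's algorithm or its theoretically tuned regularizers.

```python
import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def extended_lasso(y, X, lam_beta, lam_e, n_iter=2000):
    """Sketch of an extended Lasso:
        min_{beta,e} 0.5*||y - X beta - e||_2^2
                     + lam_beta*||beta||_1 + lam_e*||e||_1
    solved by ISTA on the augmented design A = [X | I],
    with the stacked variable w = (beta, e)."""
    n, p = X.shape
    A = np.hstack([X, np.eye(n)])              # augmented design
    w = np.zeros(p + n)                        # stacked (beta, e)
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    lam = np.concatenate([np.full(p, lam_beta), np.full(n, lam_e)])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)               # gradient of the quadratic term
        w = soft_threshold(w - grad / L, lam / L)
    return w[:p], w[p:]

# Toy demo (hypothetical sizes): sparse beta*, a few gross corruptions, small noise.
rng = np.random.default_rng(0)
n, p, k = 100, 40, 3
X = rng.standard_normal((n, p)) / np.sqrt(n)
beta_star = np.zeros(p); beta_star[:k] = [3.0, -2.0, 4.0]
e_star = np.zeros(n); e_star[:10] = 10.0      # 10% of rows grossly corrupted
y = X @ beta_star + e_star + 0.01 * rng.standard_normal(n)
beta_hat, e_hat = extended_lasso(y, X, lam_beta=0.05, lam_e=0.05)
```

On this toy instance the large corruptions are absorbed by `e_hat`, so `beta_hat` behaves like an ordinary Lasso estimate on clean data; treating the corruption as extra l1-regularized coordinates of an augmented design is what makes the problem a standard Lasso in disguise.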
