Thresholded Basis Pursuit: Quantizing Linear Programming Solutions for
Optimal Support Recovery and Approximation in Compressed Sensing
We consider the classical compressed sensing problem: we have a large under-determined set of noisy measurements $Y = GX + N$, where $X$ is a sparse signal and $G$ is drawn from a random ensemble. In this paper we focus on a quantized linear programming solution for support recovery. Our solution amounts to solving the basis pursuit linear program, $\min \|X\|_1$ subject to a data-fidelity constraint on the residual $Y - GX$, and quantizing/thresholding the resulting solution $\hat{X}$. We show that this scheme is guaranteed to perfectly reconstruct a discrete signal, or to control the element-wise reconstruction error for a continuous signal, for specific values of sparsity. We show that in the linear regime, when the sparsity $k$ increases linearly with the signal dimension $n$, the sign pattern of $X$ can be recovered under suitable SNR and measurement scalings. Our proof technique is based on a perturbation of the noiseless problem. Consequently, the achievable sparsity level in the noisy problem is comparable to that of the noiseless problem. Our result offers a sharp characterization in that neither the SNR nor the sparsity ratio can be significantly improved. In contrast, previous results based on LASSO and max-correlation techniques assume significantly larger SNR or sub-linear sparsity. We also show that our final result can be obtained from Dvoretzky's theorem rather than the restricted isometry property (RIP). The advantage of this line of reasoning is that Dvoretzky's theorem continues to hold for non-singular transformations, whereas the RIP may not be satisfied in that case.
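The two-step scheme described above (solve a basis pursuit LP, then threshold the solution) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses the noiseless basis pursuit program $\min \|x\|_1$ s.t. $Gx = y$ via `scipy.optimize.linprog` with the standard positive/negative split $x = u - v$, and a hypothetical threshold parameter `tau`; the paper's exact constraint and threshold choice may differ.

```python
import numpy as np
from scipy.optimize import linprog

def thresholded_basis_pursuit(G, y, tau):
    """Solve min ||x||_1 s.t. G x = y as an LP, then zero out entries below tau.

    LP reformulation: write x = u - v with u, v >= 0, so ||x||_1 = sum(u + v).
    """
    m, n = G.shape
    c = np.ones(2 * n)                 # objective: sum of u and v entries
    A_eq = np.hstack([G, -G])          # G(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
    x_hat = res.x[:n] - res.x[n:]
    x_hat[np.abs(x_hat) < tau] = 0.0   # quantize/threshold small entries
    return x_hat

# Demo: recover a discrete (+/-1-valued) sparse signal from random
# Gaussian measurements (noiseless case, dimensions chosen for illustration).
rng = np.random.default_rng(0)
n, m, k = 40, 25, 3
G = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.choice([-1.0, 1.0], size=k)
y = G @ x
x_hat = thresholded_basis_pursuit(G, y, tau=0.5)
print("sign pattern recovered:", np.array_equal(np.sign(x_hat), np.sign(x)))
```

In this noiseless, well-conditioned regime the LP solution is already essentially exact, so thresholding only cleans up solver-level noise; the paper's point is that the same quantize/threshold step controls errors in the noisy setting as well.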