Information-theoretic limits on sparse signal recovery: Dense versus sparse measurement matrices

Abstract

We study the information-theoretic limits of exactly recovering the support of a sparse signal using noisy projections defined by various classes of measurement matrices. Our analysis is high-dimensional in nature, in which the number of observations $n$, the ambient signal dimension $p$, and the signal sparsity $k$ are all allowed to tend to infinity in a general manner. This paper makes two novel contributions. First, we provide sharper necessary conditions for exact support recovery using general (non-Gaussian) dense measurement matrices. Combined with previously known sufficient conditions, this result yields sharp characterizations of when the optimal decoder can recover a signal for various scalings of the sparsity $k$ and sample size $n$, including the important special case of linear sparsity ($k = \Theta(p)$) using a linear scaling of observations ($n = \Theta(p)$). Our second contribution is to prove necessary conditions on the number of observations $n$ required for asymptotically reliable recovery using a class of $\gamma$-sparsified measurement matrices, where the measurement sparsity $\gamma(n, p, k) \in (0, 1]$ corresponds to the fraction of non-zero entries per row. Our analysis allows general scaling of the quadruplet $(n, p, k, \gamma)$, and reveals three different regimes, corresponding to whether measurement sparsity has no effect, a minor effect, or a dramatic effect on the information-theoretic limits of the subset recovery problem.
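
The abstract does not spell out the observation model, but the setting is the standard noisy linear model $y = X\beta + w$ with a $k$-sparse signal $\beta$ and an $n \times p$ measurement matrix $X$. The sketch below is a minimal illustration, not the paper's construction: the helper name `sparsified_measurements` is hypothetical, and the unit-variance normalization of the non-zero Gaussian entries is an assumption made for illustration. Setting $\gamma = 1$ recovers a dense Gaussian design.

```python
import numpy as np


def sparsified_measurements(n, p, k, gamma, sigma=1.0, seed=None):
    """Illustrative sketch of the sparse support-recovery setup.

    Generates a gamma-sparsified measurement matrix X (roughly a gamma
    fraction of non-zero entries per row), a k-sparse signal beta, and
    noisy projections y = X @ beta + w.
    """
    rng = np.random.default_rng(seed)

    # gamma-sparsified design: keep each Gaussian entry independently with
    # probability gamma. The 1/sqrt(gamma) scale gives unit-variance entries
    # on average (an assumed normalization; the paper's may differ).
    mask = rng.random((n, p)) < gamma
    X = mask * rng.normal(0.0, 1.0 / np.sqrt(gamma), size=(n, p))

    # k-sparse signal: support drawn uniformly at random, +/- 1 entries.
    support = rng.choice(p, size=k, replace=False)
    beta = np.zeros(p)
    beta[support] = rng.choice([-1.0, 1.0], size=k)

    # Noisy projections.
    y = X @ beta + rng.normal(0.0, sigma, size=n)
    return y, X, beta, support


# Example: linear sparsity k = Theta(p) with a linear number of observations.
y, X, beta, support = sparsified_measurements(n=200, p=400, k=40, gamma=0.1, seed=0)
```

The quantity of interest is whether any decoder, given $(y, X)$, can identify the support set exactly with probability tending to one under the stated scalings of $(n, p, k, \gamma)$.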
