
Beyond Independent Measurements: General Compressed Sensing with GNN Application

Abstract

We consider the problem of recovering a structured signal $\mathbf{x} \in \mathbb{R}^n$ from noisy linear observations $\mathbf{y} = \mathbf{M}\mathbf{x} + \mathbf{w}$. The measurement matrix is modeled as $\mathbf{M} = \mathbf{B}\mathbf{A}$, where $\mathbf{B} \in \mathbb{R}^{l \times m}$ is arbitrary and $\mathbf{A} \in \mathbb{R}^{m \times n}$ has independent sub-gaussian rows. By varying $\mathbf{B}$ and the sub-gaussian distribution of $\mathbf{A}$, this model yields a family of measurement matrices that may have heavy tails, dependent rows and columns, and singular values with a large dynamic range. When the structure is given as a possibly non-convex cone $T \subset \mathbb{R}^n$, an approximate empirical risk minimizer is proven to be a robust estimator if the effective number of measurements is sufficient, even in the presence of a model mismatch. In classical compressed sensing with independent (sub-)gaussian measurements, one asks how many measurements are needed to recover $\mathbf{x}$. In our setting, however, the effective number of measurements depends on the properties of $\mathbf{B}$. We show that the effective rank of $\mathbf{B}$ may be used as a surrogate for the number of measurements, and that if it exceeds the squared Gaussian mean width of $(T - T) \cap \mathbb{S}^{n-1}$, then accurate recovery is guaranteed. Furthermore, we examine in detail the special case of generative priors, that is, when $\mathbf{x}$ lies close to $T = \mathrm{ran}(G)$ and $G: \mathbb{R}^k \rightarrow \mathbb{R}^n$ is a Generative Neural Network (GNN) with ReLU activation functions. Our work relies on a recent result in random matrix theory by Jeong, Li, Plan, and Yilmaz (arXiv:2001.10631).
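
To make the setting concrete, the following NumPy sketch (ours, not the paper's code) builds the measurement model $\mathbf{M} = \mathbf{B}\mathbf{A}$: $\mathbf{A}$ has independent sub-gaussian (Rademacher) rows, and $\mathbf{B}$ has rapidly decaying singular values. The effective rank of $\mathbf{B}$ is taken here to be the stable rank $\|\mathbf{B}\|_F^2 / \|\mathbf{B}\|_{\mathrm{op}}^2$ (an assumption; the paper's exact definition may differ), and the squared Gaussian mean width of $(T-T) \cap \mathbb{S}^{n-1}$ is estimated by Monte Carlo for the illustrative choice of $T$ as the cone of $s$-sparse vectors rather than a GNN range.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, l, s = 200, 120, 80, 5  # ambient dim, rows of A, rows of B, sparsity

# A: independent sub-gaussian rows; i.i.d. Rademacher entries are one
# standard sub-gaussian choice (any sub-gaussian law would do).
A = rng.choice([-1.0, 1.0], size=(m, n))

# B: arbitrary; here we pick rapidly decaying singular values, so that
# M = B @ A has dependent rows and a large dynamic range.
U, _ = np.linalg.qr(rng.standard_normal((l, l)))
V, _ = np.linalg.qr(rng.standard_normal((m, m)))
sing = np.geomspace(1.0, 1e-3, num=l)
B = U @ np.diag(sing) @ V[:l, :]
M = B @ A

# Effective rank of B, taken as the stable rank ||B||_F^2 / ||B||_op^2
# (an assumption made for this illustration).
eff_rank = np.linalg.norm(B, "fro") ** 2 / np.linalg.norm(B, 2) ** 2
print(f"nominal measurements l = {l}, effective rank of B = {eff_rank:.1f}")

def mean_width_sq_sparse(n, s, trials=2000):
    """Monte Carlo estimate of the squared Gaussian mean width of
    (T - T) on the unit sphere, for T = s-sparse vectors: the supremum
    of <g, u> over 2s-sparse unit vectors u equals the l2 norm of the
    2s largest-magnitude entries of g."""
    g = rng.standard_normal((trials, n))
    top = np.sort(np.abs(g), axis=1)[:, -2 * s:]
    return np.mean(np.linalg.norm(top, axis=1)) ** 2

w2 = mean_width_sq_sparse(n, s)
print(f"estimated squared mean width = {w2:.1f}")
print("regime:", "effective rank exceeds w^2" if eff_rank > w2 else "undersampled")
```

Comparing the two printed quantities mimics the paper's recovery condition: accurate recovery is guaranteed once the effective rank of $\mathbf{B}$ exceeds the squared Gaussian mean width of $(T-T) \cap \mathbb{S}^{n-1}$, regardless of the nominal number of rows $l$.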
