Sample-Efficient Algorithms for Recovering Structured Signals from Magnitude-Only Measurements

Abstract

We consider the problem of recovering a signal $\mathbf{x}^* \in \mathbb{R}^n$ from magnitude-only measurements $y_i = |\left\langle\mathbf{a}_i, \mathbf{x}^*\right\rangle|$ for $i \in [m]$. Also known as phase retrieval, this is a fundamental challenge in biological and astronomical imaging and in speech processing. The problem is ill-posed, so additional assumptions on the signal and/or the measurements are necessary. In this paper we first study the case where the signal $\mathbf{x}^*$ is $s$-sparse. We develop a novel algorithm that we call Compressive Phase Retrieval with Alternating Minimization, or CoPRAM. Our algorithm is simple: it combines the classical alternating minimization approach for phase retrieval with the CoSaMP algorithm for sparse recovery. Despite its simplicity, we prove that CoPRAM achieves a sample complexity of $O(s^2\log n)$ with Gaussian measurements $\mathbf{a}_i$, matching the best known existing results; moreover, it exhibits linear convergence in theory and practice. Additionally, it requires no tuning parameters other than the signal sparsity $s$, and it is robust to noise. When the sorted coefficients of the sparse signal exhibit a power-law decay, we show that CoPRAM achieves a sample complexity of $O(s\log n)$, which is close to the information-theoretic limit. We also consider the case where the signal $\mathbf{x}^*$ arises from structured sparsity models. We specifically examine block-sparse signals with uniform block size $b$ and block sparsity $k = s/b$. For this problem, we design a recovery algorithm, Block CoPRAM, that further reduces the sample complexity to $O(ks\log n)$. For sufficiently large block lengths $b = \Theta(s)$, this bound equals $O(s\log n)$. To our knowledge, this constitutes the first end-to-end algorithm for phase retrieval where the Gaussian sample complexity has a sub-quadratic dependence on the signal sparsity level.
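To make the alternating structure described above concrete, here is a minimal NumPy sketch, not the authors' reference implementation: the names `copram` and `cosamp`, the iteration counts, and the crude non-spectral initialization are illustrative choices of ours; the paper's sample-complexity guarantees rely on a careful spectral initialization, which is omitted here.

```python
import numpy as np

def cosamp(A, b, s, n_iters=10):
    """Standard CoSaMP: approximately solve min ||Ax - b||_2 s.t. x is s-sparse."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        proxy = A.T @ (b - A @ x)                        # signal proxy from residual
        omega = np.argsort(np.abs(proxy))[-2 * s:]       # 2s largest proxy entries
        T = np.union1d(omega, np.nonzero(x)[0])          # merge with current support
        z, *_ = np.linalg.lstsq(A[:, T], b, rcond=None)  # least squares on T
        x = np.zeros(n)
        keep = np.argsort(np.abs(z))[-s:]                # prune back to s entries
        x[T[keep]] = z[keep]
    return x

def copram(y, A, s, n_iters=30):
    """Sketch of CoPRAM: alternate a phase (sign) estimate with a CoSaMP step."""
    m, n = A.shape
    # Crude initialization on the s largest weighted column energies; the paper
    # prescribes a spectral initialization here, which its analysis requires.
    marginals = (y ** 2) @ (A ** 2) / m
    x = np.zeros(n)
    top = np.argsort(marginals)[-s:]
    x[top] = np.sqrt(np.mean(y ** 2) / s)                # rough norm-matched scale
    for _ in range(n_iters):
        p = np.sign(A @ x)                               # step 1: estimate signs
        x = cosamp(A, p * y, s)                          # step 2: sparse recovery
    return x

# Tiny usage example with hypothetical dimensions; recovery is up to global sign.
rng = np.random.default_rng(0)
n, m, s = 100, 400, 5
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n))
y = np.abs(A @ x_true)
x_hat = copram(y, A, s)
err = min(np.linalg.norm(x_hat - x_true), np.linalg.norm(x_hat + x_true))
print("relative error up to sign:", err / np.linalg.norm(x_true))
```

Block CoPRAM follows the same alternating template, with the CoSaMP step replaced by its block-sparse counterpart operating on blocks of length $b$.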
