
Randomly Aggregated Least Squares for Support Recovery

Abstract

We study the problem of exact support recovery: given an (unknown) vector $\theta \in \left\{-1,0,1\right\}^D$, we are given access to the noisy measurement $y = X\theta + \omega$, where $X \in \mathbb{R}^{N \times D}$ is a (known) Gaussian matrix and the noise $\omega \in \mathbb{R}^N$ is an (unknown) Gaussian vector. How small can we choose $N$ and still reliably recover the support of $\theta$? We present RAWLS (Randomly Aggregated UnWeighted Least Squares Support Recovery): the main idea is to take random subsets of the $N$ equations, perform a least squares recovery over this reduced system, and then average over many random subsets. We show that the proposed procedure can provably recover an approximation of $\theta$ and demonstrate its use in support recovery through numerical examples.
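The idea described in the abstract can be sketched as follows; this is a minimal illustration assuming NumPy, with the subset size, number of subsets, and the final thresholding step chosen for illustration (the paper's actual parameter choices and aggregation details may differ):

```python
import numpy as np

def rawls(X, y, n_subsets=200, subset_size=None, rng=None):
    """Sketch of randomly aggregated least squares: solve least
    squares on many random subsets of the N equations and average
    the resulting estimates. Parameter defaults are illustrative."""
    rng = np.random.default_rng(rng)
    N, D = X.shape
    if subset_size is None:
        subset_size = max(1, N // 2)  # assumed default, not from the paper
    est = np.zeros(D)
    for _ in range(n_subsets):
        # draw a random subset of the equations without replacement
        idx = rng.choice(N, size=subset_size, replace=False)
        # minimum-norm least-squares solution on the reduced system
        theta_hat, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        est += theta_hat
    return est / n_subsets

# Example: recover the support of a sparse sign vector
rng = np.random.default_rng(0)
D, N = 50, 120
theta = np.zeros(D)
theta[:5] = 1.0                       # support is {0, 1, 2, 3, 4}
X = rng.standard_normal((N, D))       # known Gaussian measurement matrix
y = X @ theta + 0.1 * rng.standard_normal(N)  # noisy measurements
approx = rawls(X, y, rng=1)
support = np.argsort(-np.abs(approx))[:5]     # top entries by magnitude
```

Here support recovery is performed by keeping the entries of the averaged estimate with the largest magnitude; in practice one would threshold the estimate instead of assuming the sparsity level is known.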
