Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection

We focus on the high-dimensional linear regression problem, where the algorithmic goal is to efficiently infer an unknown feature vector $\beta^* \in \mathbb{R}^p$ from its linear measurements, using a small number of samples. Unlike most of the literature, we make no sparsity assumption on $\beta^*$, but instead adopt a different regularization: In the noiseless setting, we assume $\beta^*$ consists of entries which are either rational numbers with a common denominator $Q$ (referred to as $Q$-rationality), or irrational numbers supported on a rationally independent set of bounded cardinality known to the learner; collectively called the mixed-support assumption. Using a novel combination of the PSLQ integer relation detection and LLL lattice basis reduction algorithms, we propose a polynomial-time algorithm which provably recovers a $\beta^*$ enjoying the mixed-support assumption from its linear measurements $Y = X\beta^* \in \mathbb{R}^n$, for a large class of distributions for the random entries of $X$, even with one measurement ($n = 1$). In the noisy setting, we propose a polynomial-time, lattice-based algorithm which recovers a $\beta^*$ enjoying $Q$-rationality from its noisy measurements $Y = X\beta^* + W \in \mathbb{R}^n$, even with a single sample ($n = 1$). We further establish that, for large $Q$ and normal noise, this algorithm tolerates an information-theoretically optimal level of noise. We then apply these ideas to develop a polynomial-time, single-sample algorithm for the phase retrieval problem. Our methods address the single-sample regime, where sparsity-based methods such as the LASSO and Basis Pursuit are known to fail. Furthermore, our results also reveal an algorithmic connection between the high-dimensional linear regression problem and the integer relation detection, randomized subset-sum, and shortest vector problems.
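To make the lattice step concrete, here is a minimal sketch of single-sample ($n = 1$), noiseless recovery of a $Q$-rational $\beta^*$ via LLL. It is an illustration in the Lagarias-Odlyzko style under stated assumptions, not the paper's algorithm verbatim: it uses the fpylll bindings (an assumed dependency), large random integers standing in for the random measurement entries, and illustrative values of $p$, $Q$, and the scaling factor $M$.

```python
# A hedged sketch, not the paper's algorithm: single-sample noiseless recovery
# of an integer vector beta (the numerators of a Q-rational beta*) from
# y = <x, beta>, via a Lagarias-Odlyzko-style lattice and LLL reduction.
# Assumptions: fpylll is installed; x has large random integer entries
# (standing in for the random measurement vector); p, Q, M are illustrative.
import random

from fpylll import IntegerMatrix, LLL

p, Q = 8, 10
M = 10**20                                       # scaling that penalizes violating <x, b> = y

beta = [random.randrange(Q) for _ in range(p)]   # hidden numerators of the Q-rational beta*
x = [random.getrandbits(120) for _ in range(p)]  # single measurement row, n = 1
y = sum(xi * bi for xi, bi in zip(x, beta))      # the one noiseless sample

# Basis rows: (e_i, M*x_i) for each coordinate i, plus (0, ..., 0, M*y).
# The combination sum_i beta_i * row_i - row_last equals (beta, 0), which is
# unusually short once M is large, so LLL should surface it.
rows = [[int(j == i) for j in range(p)] + [M * x[i]] for i in range(p)]
rows.append([0] * p + [M * y])
A = IntegerMatrix.from_matrix(rows)
LLL.reduction(A)                                 # in-place LLL reduction


def extract(A, x, y, p):
    """Scan the reduced basis for a vector (+-beta, 0) consistent with y."""
    for i in range(A.nrows):
        v = [A[i, j] for j in range(p + 1)]
        if v[p] != 0 or not any(v[:p]):
            continue                             # row does not encode <x, b> = y
        for cand in (v[:p], [-c for c in v[:p]]):
            if sum(xi * ci for xi, ci in zip(x, cand)) == y:
                return cand
    return None


print("recovered beta:", extract(A, x, y, p) == beta)
```

The large scaling $M$ forces any reduced-basis vector that violates the measurement equation to be long, steering LLL toward the short relation $(\beta^*, 0)$; handling the irrational, rationally independent part of the mixed support is where the abstract's PSLQ integer relation detection step comes in.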