Error Prediction and Variable Selection via Unbalanced Expander Graphs

Abstract

This article investigates deterministic design matrices $X$ for the fundamental problems of error prediction and variable selection given observations $y = X\beta^\star + z$, where $z$ is a stochastic error term. The deterministic design matrices considered here are derived from unbalanced expander graphs, and we show that the prediction $X\beta^\star$ and the target vector $\beta^\star$ can be accurately estimated using computationally tractable algorithms. Using a result of Berinde, Gilbert, Indyk, Karloff and Strauss, we show that for any adjacency matrix of an unbalanced expander graph and any target vector $\beta^\star$, the lasso ($\ell_1$-penalized least squares) and the Dantzig selector ($\ell_\infty$-penalized basis pursuit) satisfy oracle inequalities for error prediction and variable selection involving the $s$ largest (in magnitude) coefficients of $\beta^\star$, i.e. upper bounds in terms of the best sparse approximation. Using recent results on Parvaresh-Vardy codes, we present a construction of such deterministic designs. Furthermore, we prove that these designs are nearly optimal: they achieve error prediction and variable selection with an accuracy that is, up to an explicit factor, the best one could expect knowing the support of the target $\beta^\star$.
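As a purely illustrative sketch (not the paper's Parvaresh-Vardy construction), the snippet below builds the $\{0,1\}$ adjacency matrix of a $d$-left-regular bipartite graph drawn at random, which is a common stand-in for an unbalanced expander, and runs the lasso ($\ell_1$-penalized least squares) via iterative soft thresholding on observations $y = X\beta^\star + z$. The dimensions, noise level, penalty `lam`, and iteration count are arbitrary assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, d, s = 100, 400, 8, 5          # measurements, variables, left degree, sparsity

# Adjacency matrix of a d-left-regular bipartite graph: each column (variable)
# is connected to d rows (measurements) chosen uniformly at random.
X = np.zeros((n, p))
for j in range(p):
    X[rng.choice(n, size=d, replace=False), j] = 1.0

# s-sparse target vector and noisy observations y = X beta_star + z.
beta_star = np.zeros(p)
support = rng.choice(p, size=s, replace=False)
beta_star[support] = rng.normal(size=s)
y = X @ beta_star + 0.1 * rng.normal(size=n)

# Lasso (ell_1-penalized least squares) solved by iterative soft thresholding.
lam = 0.1                             # illustrative penalty level
L = np.linalg.norm(X, 2) ** 2         # Lipschitz constant of the gradient
beta = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ beta - y)
    beta = beta - grad / L
    beta = np.sign(beta) * np.maximum(np.abs(beta) - lam / L, 0.0)

print("estimated support:", np.nonzero(np.abs(beta) > 1e-3)[0])
print("true support     :", np.sort(support))
```

With a well-chosen expander-like design, the recovered support typically matches the $s$ largest coefficients of $\beta^\star$, in line with the oracle inequalities stated above.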
