Error Prediction and Model Selection via Unbalanced Expander Graphs

We investigate deterministic design matrices for the fundamental problems of error prediction and model selection. Our deterministic design matrices are constructed from unbalanced expander graphs, and we ask whether it is possible to accurately estimate the response and the support of the target vector using computationally tractable algorithms. We show that for any adjacency matrix of an unbalanced expander graph and any target vector, the lasso and the Dantzig selector satisfy oracle inequalities in error prediction and model selection involving the largest (in magnitude) coefficients of the target, i.e., upper bounds in terms of its best sparse approximation. These oracle inequalities yield error prediction with an accuracy that is, up to a logarithmic factor, the best one could hope for even knowing the support of the target in advance. From a practical standpoint, the estimators can be computed by solving either a simple quadratic program (for the lasso) or a linear program (for the Dantzig selector). Our results are non-asymptotic and describe the performance one can expect in all cases.
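The following is a minimal illustrative sketch, not the paper's construction: it uses a random sparse binary matrix (with d ones per column) as a stand-in for the adjacency matrix of an unbalanced expander graph, fits the lasso with scikit-learn, and solves the Dantzig selector as a linear program with SciPy. The dimensions and the tuning parameters lam and delta are arbitrary choices for the demo.

```python
# Sketch: lasso (quadratic program) and Dantzig selector (linear program)
# on a sparse binary design standing in for an expander adjacency matrix.
import numpy as np
from scipy.optimize import linprog
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, d, k = 60, 200, 8, 5          # rows, columns, ones per column, sparsity

# Adjacency matrix of a random bipartite graph: each column (left vertex)
# connects to d rows (right vertices).  Such a matrix is an unbalanced
# expander with high probability; the paper studies deterministic designs.
A = np.zeros((n, p))
for j in range(p):
    A[rng.choice(n, size=d, replace=False), j] = 1.0

# Sparse target vector and noisy observations y = A x* + noise.
x_star = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
x_star[support] = rng.normal(size=k)
y = A @ x_star + 0.1 * rng.normal(size=n)

# Lasso: min_x (1/(2n)) ||y - A x||_2^2 + lam ||x||_1.
lam = 0.05
x_lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(A, y).coef_

# Dantzig selector: min ||x||_1  s.t.  ||A^T (y - A x)||_inf <= delta,
# written as a linear program in x = u - v with u, v >= 0.
delta = 2.0
G, b = A.T @ A, A.T @ y
A_ub = np.block([[G, -G], [-G, G]])
b_ub = np.concatenate([delta + b, delta - b])
res = linprog(c=np.ones(2 * p), A_ub=A_ub, b_ub=b_ub,
              bounds=(0, None), method="highs")
x_dantzig = res.x[:p] - res.x[p:]

print("lasso support:  ", np.flatnonzero(np.abs(x_lasso) > 1e-3))
print("dantzig support:", np.flatnonzero(np.abs(x_dantzig) > 1e-3))
print("true support:   ", np.sort(support))
```

Recovered supports can then be compared with the true support to gauge model selection, and the prediction error ||A(x_hat - x*)||_2 gauges error prediction.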