This study aims at contributing to lower bounds for empirical compatibility constants or empirical restricted eigenvalues. This is of importance in compressed sensing and the theory of $\ell_1$-regularized estimators. Let $X$ be an $n \times p$ data matrix with rows being independent copies of a $p$-dimensional random variable. Let $\hat \Sigma := X^T X / n$ be the inner product matrix. We show that the quadratic forms $\beta^T \hat \Sigma \beta$ are lower bounded by a value converging to one, uniformly over the set of vectors $\beta$ with $\beta^T \Sigma_0 \beta$ equal to one and $\ell_1$-norm at most $M$. Here $\Sigma_0$ is the theoretical inner product matrix, which we assume to exist. The constant $M$ is required to be of small order $\sqrt{n} / \log p$. We assume moreover $m$-th order isotropy for some $m > 2$, and sub-exponential tails or moments up to order $4m$ for the entries in $X$. As a consequence, we obtain convergence of the empirical compatibility constant to its theoretical counterpart, and similarly for the empirical restricted eigenvalue. If the data matrix $X$ is first normalized so that its columns all have equal length, we obtain lower bounds assuming only isotropy and no further moment conditions on its entries. The isotropy condition is shown to hold for certain martingale situations.
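The phenomenon described above can be illustrated numerically. The sketch below (a minimal simulation, not the paper's method; the choices $n = 500$, $p = 1000$, a standard Gaussian design, and a $5$-sparse test vector are all illustrative assumptions) samples an $n \times p$ matrix with isotropic rows, so the theoretical inner product matrix $\Sigma_0$ is the identity, and checks that the empirical quadratic form $\beta^T \hat \Sigma \beta$ of a sparse unit vector is close to $\beta^T \Sigma_0 \beta = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 1000  # high-dimensional regime: p > n

# Rows are i.i.d. standard Gaussian, so the theoretical inner
# product matrix Sigma_0 is the identity (an isotropic design).
X = rng.standard_normal((n, p))
Sigma_hat = X.T @ X / n  # empirical inner product matrix

# A 5-sparse vector beta with beta^T Sigma_0 beta = 1 and small
# l1-norm (here ||beta||_1 = sqrt(5), well below sqrt(n)/log(p)).
beta = np.zeros(p)
beta[:5] = 1.0 / np.sqrt(5)

quad_form = beta @ Sigma_hat @ beta  # empirical quadratic form
print(quad_form)  # concentrates near beta^T Sigma_0 beta = 1
```

The paper's result is much stronger than this single draw: the lower bound holds uniformly over all vectors of bounded $\ell_1$-norm, which is what makes it usable for compatibility constants and restricted eigenvalues.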