General framework for projection structures

In the first part, we develop a general framework for projection structures and study several inference problems within this framework. We propose procedures based on data-dependent measures (DDM) and make connections with empirical Bayes and penalization methods. The main inference problem is uncertainty quantification (UQ), but along the way we also solve the estimation and DDM-contraction problems, as well as a weak version of the structure recovery problem. The approach is local in that the quality of the inference procedures is measured by a local quantity, the oracle rate, which is the best trade-off between the approximation error of a projection structure and the complexity of that approximating structure. As in statistical learning settings, the theory is distribution-free: no particular model is imposed, and we only assume certain mild conditions on the stochastic part of the projection predictor. We introduce the excessive bias restriction (EBR) under which we establish the local confidence optimality of the constructed confidence ball. The proposed general framework unifies a very broad class of high-dimensional models and structures, interesting and important in their own right.

In the second part, we apply the developed theory and demonstrate how the general results deliver a whole range of local and global minimax results (many new ones; some known results from the literature are improved) for particular models and structures as consequences, including the white noise model and density estimation with smoothness structure, linear regression and dictionary learning with sparsity structures, biclustering and stochastic block models with clustering structure, covariance matrix estimation with banding and sparsity structures, and many others. Adaptive minimax results over various scales also follow from our local results.
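The bias-complexity trade-off defining the oracle rate can be sketched schematically as follows; the notation here is illustrative (ours, not the paper's), assuming a truth $f$, a family of projections $\{P_I\}$ indexed by structures $I$, and a complexity penalty $\rho(I)$:

```latex
% Schematic oracle rate: the best trade-off between the approximation
% error by a projection structure I and the complexity of that structure.
r^2(f) \;=\; \min_{I}\Big\{\underbrace{\|f - P_I f\|^2}_{\text{approximation error}}
\;+\; \underbrace{\rho(I)}_{\text{complexity of } I}\Big\}
```

The local nature of the theory means the performance benchmark $r^2(f)$ depends on the underlying truth $f$ itself, rather than on a worst case over a fixed parameter class.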