SURE Information Criteria for Large Covariance Matrix Estimation and Their Asymptotic Properties

Consider $n$ independent and identically distributed $p$-dimensional Gaussian random vectors with covariance matrix $\Sigma$. The problem of estimating $\Sigma$ when $p$ is much larger than $n$ has received a lot of attention in recent years. Yet little is known about the information criterion for covariance matrix estimation. How should such a criterion be properly defined, and what are its statistical properties? We attempt to answer these questions in the present paper by focusing on the estimation of bandable covariance matrices in the high-dimensional setting. Motivated by the deep connection between Stein's unbiased risk estimation (SURE) and AIC in regression models, we propose a family of generalized SURE criteria ($\mathrm{SURE}_c$) indexed by $c$ for covariance matrix estimation, where $c$ is some constant. When $c$ is 2, $\mathrm{SURE}_2$ provides an unbiased estimator of the Frobenius risk of the covariance matrix estimator. Furthermore, we show that by minimizing $\mathrm{SURE}_2$ over all possible banding covariance matrix estimators we attain the minimax optimal rate of convergence, and the resulting estimator behaves like the covariance matrix estimator obtained by the so-called oracle tuning. On the other hand, we also show that $\mathrm{SURE}_2$ is selection inconsistent when the true covariance matrix is exactly banded. To fix the selection inconsistency, we consider using SURE with a larger choice of $c$ and prove that by minimizing this criterion we select the true bandwidth with probability tending to one. Therefore, our analysis indicates that $\mathrm{SURE}_2$ and SURE with the larger $c$ can be regarded as the AIC and BIC for large covariance matrix estimation, respectively.
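
The sketch below illustrates the kind of procedure the abstract describes: banding the sample covariance and choosing the bandwidth by minimizing a SURE-type criterion. The exact definition of $\mathrm{SURE}_c$ is given in the paper and is not reproduced here; as an assumption, the criterion is taken (up to terms not depending on the bandwidth) as the squared Frobenius distance between the banded and unbanded sample covariance plus $c$ times the summed estimated variances of the retained entries, with the Gaussian plug-in variance estimate $(s_{ii}s_{jj}+s_{ij}^2)/n$. All function names are illustrative, not from the paper.

```python
# Minimal sketch (not the paper's exact criterion) of SURE_c-style bandwidth
# selection for a banding covariance estimator.
import numpy as np

def banding(S, k):
    """Banding operator B_k: keep entries with |i - j| <= k, zero out the rest."""
    p = S.shape[0]
    mask = np.abs(np.subtract.outer(np.arange(p), np.arange(p))) <= k
    return S * mask

def sure_c(X, k, c=2.0):
    """Assumed SURE_c-type score for the banding estimator with bandwidth k."""
    n, p = X.shape
    S = np.cov(X, rowvar=False)                                  # sample covariance
    mask = np.abs(np.subtract.outer(np.arange(p), np.arange(p))) <= k
    resid = np.sum((S - S * mask) ** 2)                          # ||B_k(S) - S||_F^2
    var_hat = (np.outer(np.diag(S), np.diag(S)) + S ** 2) / n    # est. Var(s_ij), Gaussian case
    penalty = c * np.sum(var_hat[mask])                          # c * sum over retained entries
    return resid + penalty

def select_bandwidth(X, c=2.0):
    """Pick the bandwidth minimizing the SURE_c-type score (c = 2 is the AIC-like choice)."""
    scores = [sure_c(X, k, c) for k in range(X.shape[1])]
    return int(np.argmin(scores))

# Example with a bandable (AR(1)-type) truth; illustrative only.
rng = np.random.default_rng(0)
p, n = 30, 200
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
k_hat = select_bandwidth(X, c=2.0)
Sigma_hat = banding(np.cov(X, rowvar=False), k_hat)
```

In this sketch, $c = 2$ plays the AIC-like role aimed at estimation risk, while a larger $c$ mimics the BIC-like, more heavily penalized criterion the abstract associates with consistent bandwidth selection when the truth is exactly banded.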