Dense Signals, Linear Estimators, and Out-of-Sample Prediction for High-Dimensional Linear Models

Abstract

Motivated by questions about dense (non-sparse) signals in high-dimensional data analysis, we study the unconditional out-of-sample prediction error (predictive risk) associated with three popular linear estimators for high-dimensional linear models: ridge regression estimators, scalar multiples of the ordinary least squares (OLS) estimator (referred to as James-Stein shrinkage estimators), and marginal regression estimators. The results in this paper require no assumptions about sparsity and imply: (i) if prior information about the population predictor covariance is available, then the ridge estimator outperforms the OLS, James-Stein, and marginal estimators; (ii) if little is known about the population predictor covariance, then the James-Stein estimator may be an effective alternative to the ridge estimator; and (iii) the marginal estimator has serious deficiencies for out-of-sample prediction. We study both finite-sample and asymptotic properties of the estimators; though various asymptotic regimes are considered, we focus on the setting where the number of predictors is roughly proportional to the number of observations. Ultimately, the results presented here provide new and detailed practical guidance regarding several well-known non-sparse methods for high-dimensional linear models.
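
The three estimators and the notion of predictive risk can be made concrete with a short simulation. The following Python sketch is illustrative only and is not the paper's analysis: it assumes an identity predictor covariance, one particular dense signal, and ad hoc choices for the ridge penalty and the James-Stein shrinkage factor (the paper derives its own tuning and risk formulas).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions with p roughly proportional to n (here p/n = 0.8),
# matching the asymptotic regime emphasized in the abstract.
n, p = 200, 160

# Dense (non-sparse) signal: every coefficient is nonzero.
beta = rng.normal(scale=1.0 / np.sqrt(p), size=p)

# Identity predictor covariance is an assumption of this sketch only.
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(size=n)

# Ridge regression: (X'X + lam * I)^{-1} X'y; penalty chosen ad hoc here.
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# OLS, then a scalar multiple of it (a James-Stein-type shrinkage estimator).
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2_hat = np.sum((y - X @ beta_ols) ** 2) / (n - p)
# Shrinkage factor: one simple plug-in choice, not the paper's formula.
c = max(0.0, 1.0 - p * sigma2_hat / (beta_ols @ X.T @ X @ beta_ols))
beta_js = c * beta_ols

# Marginal regression: coordinate-wise simple regression of y on each x_j.
beta_marg = (X.T @ y) / np.sum(X**2, axis=0)

# Monte Carlo estimate of out-of-sample prediction error (predictive risk)
# on fresh draws from the same design distribution.
X_new = rng.normal(size=(10000, p))
y_new = X_new @ beta + rng.normal(size=10000)
for name, b in [("ridge", beta_ridge), ("james-stein", beta_js),
                ("marginal", beta_marg)]:
    risk = np.mean((y_new - X_new @ b) ** 2)
    print(f"{name:12s} predictive risk: {risk:.3f}")
```

Under this isotropic design, one would typically see the ridge and James-Stein estimators achieve similar risk, consistent with point (ii) of the abstract, while the marginal estimator lags behind.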
