The computational complexity of simultaneous inference methods in high-dimensional linear regression models increases rapidly with the number of variables. This paper proposes a computationally efficient method based on the Moore-Penrose pseudoinverse. Under a symmetry assumption on the available regressors, the estimators are normally distributed and accompanied by a closed-form expression for the standard errors that is free of tuning parameters. We study the numerical performance in Monte Carlo experiments that mimic the size of modern applications for which existing methods are computationally infeasible. We find close to nominal coverage, even in settings where the imposed symmetry assumption does not hold. Regularization of the pseudoinverse via a ridge adjustment is shown to yield possible efficiency gains.
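A minimal sketch of the two estimators named in the abstract, assuming the point estimate takes the standard least-norm form $\hat\beta = X^{+} y$ and the ridge adjustment the standard form $(X^\top X + \lambda I)^{-1} X^\top y$; the paper's exact estimator and its closed-form standard-error expression are not reproduced here, and the function names and data are illustrative only.

```python
import numpy as np

def pinv_estimator(X, y):
    """Least-norm estimate via the Moore-Penrose pseudoinverse: beta_hat = X^+ y."""
    return np.linalg.pinv(X) @ y

def ridge_pinv_estimator(X, y, lam):
    """Ridge-adjusted pseudoinverse estimate: (X'X + lam*I)^{-1} X'y.
    Illustrative regularization of the pseudoinverse, not the paper's exact formula."""
    n, p = X.shape
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Hypothetical high-dimensional design with p > n, for illustration only.
rng = np.random.default_rng(0)
n, p = 100, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + rng.standard_normal(n)

beta_pinv = pinv_estimator(X, y)
beta_ridge = ridge_pinv_estimator(X, y, lam=1.0)
```

Both computations avoid the combinatorial cost that simultaneous inference procedures typically incur as the number of variables grows, which is the computational point the abstract emphasizes.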