We study a high-dimensional regression model. The aim is to construct a confidence set for a given group of regression coefficients, treating all other regression coefficients as nuisance parameters. We apply a one-step procedure with the square-root Lasso as initial estimator and a multivariate square-root Lasso for constructing a surrogate Fisher information matrix. The multivariate square-root Lasso is based on nuclear norm loss with ℓ₁-penalty. We show that this procedure leads to an asymptotically χ²-distributed pivot, with a remainder term depending only on the ℓ₁-error of the initial estimator. We show that under ℓ₀-sparsity conditions on the regression coefficients β⁰ the square-root Lasso produces a consistent estimator of the noise variance, and we establish sharp oracle inequalities which show that the remainder term is small under further sparsity conditions on β⁰ and compatibility conditions on the design.
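As a rough illustration of the two building blocks named above, a minimal sketch is given below; the notation (response Y, design X, group index set J, columns X_J and X_{-J}, tuning parameters λ₀ and λ) and the exact scaling of the losses are assumptions for exposition, not taken from the abstract itself.

```latex
% Sketch of the two estimators; notation and scaling are illustrative assumptions.

% Square-root Lasso (initial estimator for the regression coefficients):
\hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p}
  \left\{ \frac{\| Y - X\beta \|_2}{\sqrt{n}} + \lambda_0 \| \beta \|_1 \right\}

% Multivariate square-root Lasso (used to build the surrogate Fisher
% information for the group J; nuclear norm loss with an l1-penalty):
\hat{\Gamma}_J \in \arg\min_{\Gamma}
  \left\{ \frac{\| X_J - X_{-J}\Gamma \|_{\mathrm{nuclear}}}{n} + \lambda \| \Gamma \|_1 \right\}
```

In this sketch the one-step correction would update the initial estimate of the group coefficients using the residuals and \hat{\Gamma}_J, yielding the asymptotically χ²-distributed pivot described in the abstract; the remainder term is controlled by the ℓ₁-error of \hat{\beta}.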