Recent results have established the minimax optimality of LASSO and related algorithms for noisy linear regression. However, these results tend to rely on variance estimators that are inefficient, or on optimization procedures that are slower than LASSO itself. We propose an efficient estimator for the noise variance in high-dimensional linear regression that is faster than LASSO, requiring only matrix-vector multiplications. We prove that this estimator is consistent, with a favorable rate of convergence, under the condition that the design matrix satisfies the Restricted Isometry Property (RIP). In practice, our estimator scales well to high dimensions, is highly parallelizable, and incurs only a modest bias.
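The abstract does not specify the proposed estimator, so as an illustration of the general family it describes, here is a sketch of a classical moment-based noise-variance estimator (due to Dicker, 2014) that likewise needs only matrix-vector products, namely the quantities ||y||^2 and ||X^T y||^2. This is an assumed stand-in for exposition, not the paper's method; under an i.i.d. standard Gaussian design it is unbiased for the noise variance even when p > n.

```python
import numpy as np

def moment_variance_estimator(X, y):
    # Moment-based estimator of sigma^2 in y = X beta + eps, eps ~ N(0, sigma^2 I),
    # valid for X with iid N(0, 1) entries (Dicker, 2014).  It uses only
    # ||y||^2 and ||X^T y||^2, i.e. one matrix-vector multiplication with X^T.
    n, p = X.shape
    yy = y @ y              # ||y||^2
    Xty = X.T @ y           # single matrix-vector product
    return (n + p + 1) / (n * (n + 1)) * yy - (Xty @ Xty) / (n * (n + 1))

rng = np.random.default_rng(0)
n, p, s, sigma = 500, 1000, 10, 2.0    # n samples, p features, s nonzero coefficients
estimates = []
for _ in range(20):
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:s] = 1.0 / np.sqrt(s)        # sparse signal with ||beta|| = 1
    y = X @ beta + sigma * rng.standard_normal(n)
    estimates.append(moment_variance_estimator(X, y))
print(np.mean(estimates))              # close to sigma^2 = 4.0
```

Because the estimator touches the data only through inner products with `y`, it runs in O(np) time with no iterative optimization, which is the kind of computational profile the abstract claims for the proposed method.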