Laplace Approximation in High-dimensional Bayesian Regression

We consider Bayesian variable selection in sparse high-dimensional regression, where the number of covariates may be large relative to the sample size, but at most a moderate number of covariates are active. Specifically, we treat generalized linear models. For a single fixed sparse model with a well-behaved prior distribution, classical theory shows that the Laplace approximation to the marginal likelihood of the model is accurate for sufficiently large sample size. We extend this theory with results on the uniform accuracy of the Laplace approximation across all models in a high-dimensional setting in which the number of covariates and the allowed model size, and thus also the number of considered models, may grow with the sample size. Moreover, we show how this connection between the marginal likelihood and the Laplace approximation can be used to obtain consistency results for Bayesian approaches to variable selection in high-dimensional regression.
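For context, the Laplace approximation replaces the intractable marginal likelihood integral of a model with a Gaussian integral centered at the posterior mode. In generic notation (chosen here for illustration, not necessarily the paper's own), a standard form for a sparse model $M$ is
\[
p(y \mid M) = \int p(y \mid \beta, M)\, \pi(\beta \mid M)\, d\beta
\;\approx\;
p(y \mid \hat\beta_M, M)\, \pi(\hat\beta_M \mid M)\,
(2\pi)^{|M|/2} \bigl(\det H_M\bigr)^{-1/2},
\]
where $\hat\beta_M$ is the posterior mode under model $M$, $|M|$ is the number of active covariates, and $H_M = -\nabla^2 \log\bigl[p(y \mid \beta, M)\,\pi(\beta \mid M)\bigr]\big|_{\beta=\hat\beta_M}$ is the negative Hessian of the log posterior at the mode. The results described above concern how the error of this approximation behaves uniformly over all candidate models as the dimension grows.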