This article deals with random projections applied as a data reduction technique for Bayesian regression analysis. We show sufficient conditions under which the entire $d$-dimensional distribution is approximately preserved under random projections by reducing the number of data points from $n$ to $k \in O(\operatorname{poly}(d/\varepsilon))$ in the case $n \gg d$. Under mild assumptions, we prove that evaluating a Gaussian likelihood function based on the projected data instead of the original data yields a $(1+O(\varepsilon))$-approximation in terms of the Wasserstein distance. Our main result shows that the posterior distribution of Bayesian linear regression is approximated up to a small error depending on only an $\varepsilon$-fraction of its defining parameters. This holds when using arbitrary Gaussian priors or the degenerate case of uniform distributions over $\mathbb{R}^d$ for $\beta$. Our empirical evaluations involve different simulated settings of Bayesian linear regression. Our experiments underline that the proposed method is able to recover the regression model up to small error while considerably reducing the total running time.
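To illustrate the general idea (not the authors' implementation or experimental setup), the following minimal Python sketch compresses a regression data set $(X, y)$ with a Gaussian random projection $\Pi$ of size $k \times n$ and then computes the Gaussian posterior of Bayesian linear regression from the projected data $(\Pi X, \Pi y)$ exactly as one would from the full data. The sketch size $k$, prior variance, and noise variance below are arbitrary illustrative choices.

```python
# Hedged sketch: random projection as data reduction for Bayesian linear regression.
# All concrete values (n, d, k, sigma, tau2) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data in the regime n >> d.
n, d, k = 20000, 10, 500              # k is a hypothetical sketch size
beta_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
sigma = 1.0                           # assumed known noise standard deviation
y = X @ beta_true + sigma * rng.normal(size=n)

# Gaussian random projection (one common choice of sketching matrix).
Pi = rng.normal(scale=1.0 / np.sqrt(k), size=(k, n))
Xs, ys = Pi @ X, Pi @ y               # projected data: k x d and k

def gaussian_posterior(X, y, sigma2=1.0, tau2=10.0):
    """Posterior N(mu, Sigma) for beta under the prior N(0, tau2 * I)."""
    precision = X.T @ X / sigma2 + np.eye(X.shape[1]) / tau2
    Sigma = np.linalg.inv(precision)
    mu = Sigma @ X.T @ y / sigma2
    return mu, Sigma

# Compare posterior means computed from the full and the projected data.
mu_full, _ = gaussian_posterior(X, y)
mu_proj, _ = gaussian_posterior(Xs, ys)
print("max |mu_full - mu_proj| =", np.max(np.abs(mu_full - mu_proj)))
```

Because the posterior only enters through the quadratic forms $X^\top X$ and $X^\top y$, the projected data of size $k \times d$ suffice to approximate it, which is where the running-time savings come from when $k \ll n$.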