arXiv:1504.06122
Random projections for Bayesian regression

23 April 2015
Leo N. Geppert
K. Ickstadt
Alexander Munteanu
Jens Quedenfeld
C. Sohler
Abstract

This article deals with random projections applied as a data reduction technique for Bayesian regression analysis. We show sufficient conditions under which the entire $d$-dimensional distribution is approximately preserved under random projections by reducing the number of data points from $n$ to $k \in O(\operatorname{poly}(d/\varepsilon))$ in the case $n \gg d$. Under mild assumptions, we prove that evaluating a Gaussian likelihood function based on the projected data instead of the original data yields a $(1+O(\varepsilon))$-approximation in terms of the $\ell_2$ Wasserstein distance. Our main result shows that the posterior distribution of Bayesian linear regression is approximated up to a small error depending on only an $\varepsilon$-fraction of its defining parameters. This holds when using arbitrary Gaussian priors or the degenerate case of uniform distributions over $\mathbb{R}^d$ for $\beta$. Our empirical evaluations involve different simulated settings of Bayesian linear regression. Our experiments underline that the proposed method is able to recover the regression model up to small error while considerably reducing the total running time.
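To illustrate the idea of the abstract, the following sketch reduces an $n \times d$ regression problem to $k \ll n$ rows with a dense Gaussian random projection and compares the least-squares solution (the posterior mean under a flat prior) on the original and reduced data. This is a minimal illustration, not the authors' exact construction: the paper analyses subspace embeddings in general, and the specific projection, the constants, and the dimensions below (`n`, `d`, `k`, the noise level) are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear-regression data with n >> d.
n, d = 10_000, 5
X = rng.normal(size=(n, d))
beta_true = rng.normal(size=d)
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Dense Gaussian sketching matrix S of shape (k, n); one simple choice
# of random projection. Theory asks for k in O(poly(d / eps)).
k = 200
S = rng.normal(size=(k, n)) / np.sqrt(k)

# Reduced data: k rows instead of n.
SX, Sy = S @ X, S @ y

# Least-squares fit on the full and on the projected data.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_sketch, *_ = np.linalg.lstsq(SX, Sy, rcond=None)

# The two coefficient vectors agree up to a small error for k >> d.
print(np.linalg.norm(beta_full - beta_sketch))
```

In practice, structured projections (e.g. subsampled randomized Hadamard transforms or sparse embeddings) replace the dense Gaussian matrix so that computing `S @ X` is cheaper than solving the original problem.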
