Stacking Variational Bayesian Monte Carlo

7 April 2025
Francesco Silvestrin
Chengkun Li
Luigi Acerbi
    BDL
Abstract

Variational Bayesian Monte Carlo (VBMC) is a sample-efficient method for approximate Bayesian inference with computationally expensive likelihoods. While VBMC's local surrogate approach provides stable approximations, its conservative exploration strategy and limited evaluation budget can cause it to miss regions of complex posteriors. In this work, we introduce Stacking Variational Bayesian Monte Carlo (S-VBMC), a method that constructs global posterior approximations by merging independent VBMC runs through a principled and inexpensive post-processing step. Our approach leverages VBMC's mixture posterior representation and per-component evidence estimates, requiring no additional likelihood evaluations while being naturally parallelizable. We demonstrate S-VBMC's effectiveness on two synthetic problems designed to challenge VBMC's exploration capabilities and two real-world applications from computational neuroscience, showing substantial improvements in posterior approximation quality across all cases.
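The stacking step described above can be sketched in a few lines: each VBMC run returns a Gaussian-mixture variational posterior plus per-component evidence estimates, and the runs are merged by concatenating components and reweighting them by their evidence contributions. The dictionary layout (`weights`, `means`, `covs`, `log_z`) is a hypothetical interface for illustration, not the actual S-VBMC or PyVBMC API; the exact per-component weighting in the paper is based on ELBO contributions.

```python
import numpy as np

def stack_vbmc_runs(runs):
    """Merge independent VBMC mixture posteriors into one global mixture.

    Each element of `runs` is assumed to be a dict with:
      'weights': (K,)       mixture weights of that run's posterior
      'means':   (K, D)     component means
      'covs':    (K, D, D)  component covariances
      'log_z':   (K,)       per-component log-evidence estimates

    This is an illustrative sketch of the stacking idea only.
    """
    # Pool all components across runs.
    w = np.concatenate([r['weights'] for r in runs])
    log_z = np.concatenate([r['log_z'] for r in runs])
    means = np.concatenate([r['means'] for r in runs], axis=0)
    covs = np.concatenate([r['covs'] for r in runs], axis=0)

    # Reweight each component by its evidence contribution,
    # working in log space for numerical stability.
    log_w = np.log(w) + log_z
    log_w -= log_w.max()
    new_w = np.exp(log_w)
    new_w /= new_w.sum()
    return new_w, means, covs
```

Note that no new likelihood evaluations are needed: the merge uses only quantities each run has already computed, which is why the post-processing is cheap and the runs can be executed in parallel.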

View on arXiv
@article{silvestrin2025_2504.05004,
  title={Stacking Variational Bayesian Monte Carlo},
  author={Francesco Silvestrin and Chengkun Li and Luigi Acerbi},
  journal={arXiv preprint arXiv:2504.05004},
  year={2025}
}