arXiv:2402.06173
SMC Is All You Need: Parallel Strong Scaling

9 February 2024
Xin Liang
J. Lukens
Sanjaya Lohani
Brian T. Kirby
T. Searles
Kody J. H. Law
Abstract

The Bayesian posterior distribution can only be evaluated up to a constant of proportionality, which makes simulation and consistent estimation challenging. Classical consistent Bayesian methods such as sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) have unbounded time-complexity requirements. We develop a fully parallel sequential Monte Carlo (pSMC) method which provably delivers parallel strong scaling, i.e. the time complexity (and per-node memory) remains bounded if the number of asynchronous processes is allowed to grow. More precisely, pSMC has a theoretical convergence rate of $\mathrm{MSE} = O(1/(NP))$, where $N$ denotes the number of communicating samples in each processor and $P$ denotes the number of processors. In particular, for suitably large problem-dependent $N$, as $P \rightarrow \infty$ the method converges to arbitrarily small accuracy $\mathrm{MSE} = O(\varepsilon^2)$ with a fixed finite time complexity of $O(1)$ and with no efficiency leakage, i.e. a total computational cost of $O(\varepsilon^{-2})$. A range of Bayesian inference problems is considered to compare the pSMC and MCMC methods.
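To make the scaling idea concrete, here is a minimal Python sketch of one way such a parallel combination can work: run P independent tempered SMC samplers of N particles each, then average their posterior-mean estimates weighted by each run's normalizing-constant (evidence) estimate. This is an illustrative toy under stated assumptions, not the authors' implementation: the linear tempering schedule, the single random-walk Metropolis move, and the names smc_run and psmc_estimate are all hypothetical, and the P runs are executed serially here rather than on asynchronous processors.

import numpy as np

def smc_run(sample_prior, log_prior, log_lik, N, n_temps, rng, step=0.5):
    """One independent SMC sampler with simple likelihood tempering.

    Returns (posterior-mean estimate, log normalizing-constant estimate);
    the latter is used to weight this run when combining across runs.
    NOTE: an illustrative sketch, not the paper's pSMC implementation.
    """
    betas = np.linspace(0.0, 1.0, n_temps + 1)    # tempering schedule (assumed)
    x = sample_prior(N, rng)                      # (N, d) particles from the prior
    log_z = 0.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        logw = (b1 - b0) * log_lik(x)             # incremental importance weights
        m = logw.max()
        log_z += m + np.log(np.mean(np.exp(logw - m)))  # evidence update
        w = np.exp(logw - m)
        w /= w.sum()
        x = x[rng.choice(N, size=N, p=w)]         # multinomial resampling
        # One random-walk Metropolis move targeting prior(x) * lik(x)^b1
        prop = x + step * rng.standard_normal(x.shape)
        log_a = (log_prior(prop) + b1 * log_lik(prop)
                 - log_prior(x) - b1 * log_lik(x))
        accept = np.log(rng.uniform(size=N)) < log_a
        x = np.where(accept[:, None], prop, x)
    return x.mean(axis=0), log_z

def psmc_estimate(sample_prior, log_prior, log_lik, N, n_temps, P, seed=0):
    """Combine P independent SMC runs by evidence-weighted averaging
    (run serially here; in principle the runs are embarrassingly parallel)."""
    runs = [smc_run(sample_prior, log_prior, log_lik, N, n_temps,
                    np.random.default_rng(seed + p)) for p in range(P)]
    means, log_zs = (np.array(a) for a in zip(*runs))
    w = np.exp(log_zs - log_zs.max())
    w /= w.sum()
    return w @ means                              # weighted average of run means

# Toy usage: standard normal prior, Gaussian likelihood centred at 1,
# so the exact posterior mean is 0.5 for this conjugate model.
d = 1
sample_prior = lambda n, rng: rng.standard_normal((n, d))
log_prior = lambda x: -0.5 * np.sum(x**2, axis=1)
log_lik = lambda x: -0.5 * np.sum((x - 1.0)**2, axis=1)
print(psmc_estimate(sample_prior, log_prior, log_lik, N=200, n_temps=20, P=16))

Because the runs never communicate, doubling P leaves each run's time and memory unchanged, which is the intuition behind the strong-scaling claim; the evidence weighting is one standard way to combine independent runs without biasing the estimate toward low-evidence samplers.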
