Better Together: pooling information in likelihood-free inference

5 December 2022
David T. Frazier
Christopher C. Drovandi
David J. Nott
Abstract

Likelihood-free inference (LFI) methods, such as approximate Bayesian computation (ABC), are now routinely applied to conduct inference in complex models. While the application of LFI is now commonplace, the choice of which summary statistics to use in the construction of the posterior remains an open question that is fraught with both practical and theoretical challenges. Instead of choosing a single vector of summaries on which to base inference, we suggest a new pooled posterior and show how to optimally combine inferences from different LFI posteriors. This pooled approach to inference obviates the need to choose a single vector of summaries, or even a single LFI algorithm, and delivers guaranteed inferential accuracy without requiring the computational resources associated with sampling LFI posteriors in high dimensions. We illustrate this approach through a series of benchmark examples considered in the LFI literature.
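The abstract describes combining several LFI posteriors, each built from a different choice of summary statistics, into a single pooled posterior. The sketch below is only an illustration of that idea on a toy Gaussian model: it runs plain rejection ABC twice with two assumed summary choices and then mixes the resulting draws with placeholder equal weights. The model, prior, summary statistics, acceptance fraction, and weights are all assumptions made for the example; the paper's optimal pooling weights and accuracy guarantees are not reproduced here.

```python
# Illustrative sketch only: two rejection-ABC posteriors from different
# summary statistics, naively pooled into one set of draws.
import numpy as np

rng = np.random.default_rng(0)

# Toy data-generating process: y_i ~ Normal(theta, 1), n = 100 observations.
theta_true = 2.0
y_obs = rng.normal(theta_true, 1.0, size=100)

# Two assumed choices of summary statistics for the same data set.
summaries = {
    "mean": lambda y: np.array([y.mean()]),
    "median_iqr": lambda y: np.array(
        [np.median(y), np.subtract(*np.percentile(y, [75, 25]))]
    ),
}

def rejection_abc(summary_fn, n_sims=20_000, keep_frac=0.01):
    """Plain rejection ABC: draw theta from the prior, simulate data,
    and keep the draws whose summaries are closest to the observed ones."""
    s_obs = summary_fn(y_obs)
    theta = rng.normal(0.0, 5.0, size=n_sims)  # prior: Normal(0, 5^2)
    sims = rng.normal(theta[:, None], 1.0, size=(n_sims, y_obs.size))
    s_sim = np.apply_along_axis(summary_fn, 1, sims)
    dist = np.linalg.norm(s_sim - s_obs, axis=1)
    n_keep = int(keep_frac * n_sims)
    return theta[np.argsort(dist)[:n_keep]]

# One approximate posterior per summary choice.
posteriors = {name: rejection_abc(fn) for name, fn in summaries.items()}

# Placeholder pooling: an equally weighted mixture of the two sets of draws.
# The paper is about choosing this combination optimally; equal weights are
# used here purely so the example runs end to end.
weights = {name: 0.5 for name in posteriors}
pooled = np.concatenate(
    [rng.choice(draws, size=int(weights[name] * 5_000), replace=True)
     for name, draws in posteriors.items()]
)

for name, draws in posteriors.items():
    print(f"{name:11s} posterior mean: {draws.mean():.3f}")
print(f"{'pooled':11s} posterior mean: {pooled.mean():.3f}")
```

In this toy example both summary choices are informative about theta, so any reasonable pooling behaves well; the sketch only demonstrates the mechanics of producing several LFI posteriors and mixing them, not the paper's method for weighting them.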
