Approximate Bayesian Computation for Smoothing

22 June 2012
James S. Martin, Ajay Jasra, Sumeetpal S. Singh, N. Whiteley, E. McCoy
arXiv:1206.5208
Abstract

We consider a method for approximate inference in hidden Markov models (HMMs). The method circumvents the need to evaluate conditional densities of observations given the hidden states. It may be considered an instance of Approximate Bayesian Computation (ABC) and it involves the introduction of auxiliary variables valued in the same space as the observations. The quality of the approximation may be controlled to arbitrary precision through a parameter $\epsilon > 0$. We provide theoretical results which quantify, in terms of $\epsilon$, the ABC error in approximation of expectations of additive functionals with respect to the smoothing distributions. Under regularity assumptions, this error is $O(n\epsilon)$, where $n$ is the number of time steps over which smoothing is performed. For numerical implementation we adopt the forward-only sequential Monte Carlo (SMC) scheme of [16] and quantify the combined error from the ABC and SMC approximations. This forms some of the first quantitative results for ABC methods which jointly treat the ABC and simulation errors, with a finite number of data and simulated samples. When the HMM has unknown static parameters, we consider particle Markov chain Monte Carlo (PMCMC) methods [2] for batch statistical inference.
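
To make the idea concrete, below is a minimal, self-contained sketch (not the paper's implementation) of an ABC particle filter on a toy linear-Gaussian HMM, where the observation density is only ever simulated from, never evaluated: each particle generates a pseudo-observation and is weighted by a uniform (indicator) ABC kernel with tolerance eps, which plays the role of $\epsilon$ above. The toy model, the additive functional, and all numerical settings are assumptions for illustration only, and the smoothed sums are carried along particle ancestries rather than with the $O(N^2)$ forward-only recursion of [16].

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian HMM used only for illustration (all settings are assumptions):
#   x_t = 0.9 * x_{t-1} + v_t,  v_t ~ N(0, 1)        (transition density is tractable)
#   y_t = x_t + w_t,            w_t ~ N(0, 0.5^2)    (treated as simulate-only for ABC)
PHI, SIG_V, SIG_W = 0.9, 1.0, 0.5


def simulate_data(T):
    """Generate a state trajectory and observations from the toy HMM."""
    x = np.zeros(T)
    y = np.zeros(T)
    x[0] = rng.normal(0.0, SIG_V)
    y[0] = x[0] + rng.normal(0.0, SIG_W)
    for t in range(1, T):
        x[t] = PHI * x[t - 1] + rng.normal(0.0, SIG_V)
        y[t] = x[t] + rng.normal(0.0, SIG_W)
    return x, y


def abc_smc_smoother(y, N=500, eps=0.2):
    """ABC particle filter that accumulates the additive functional
    F_n(x_{0:n}) = sum_t x_t (chosen here purely as an example)."""
    T = len(y)
    x = rng.normal(0.0, SIG_V, size=N)             # particles at time 0
    u = x + rng.normal(0.0, SIG_W, size=N)         # pseudo-observations (simulate-only step)
    w = (np.abs(u - y[0]) <= eps).astype(float)    # ABC indicator kernel, tolerance eps
    w = w / w.sum() if w.sum() > 0 else np.full(N, 1.0 / N)
    V = x.copy()                                   # per-particle running sums of the functional
    for t in range(1, T):
        anc = rng.choice(N, size=N, p=w)           # multinomial resampling
        x = PHI * x[anc] + rng.normal(0.0, SIG_V, size=N)
        V = V[anc] + x                             # carry smoothed sums along ancestral lines
        u = x + rng.normal(0.0, SIG_W, size=N)     # one pseudo-observation per particle
        w = (np.abs(u - y[t]) <= eps).astype(float)
        w = w / w.sum() if w.sum() > 0 else np.full(N, 1.0 / N)
    return np.dot(w, V)                            # ABC-SMC estimate of E[sum_t x_t | y_{0:T-1}]


x_true, y_obs = simulate_data(50)
print("true sum of states:", x_true.sum())
print("ABC-SMC estimate:  ", abc_smc_smoother(y_obs))
```

Shrinking eps tightens the ABC approximation (at the cost of more rejected pseudo-observations and higher weight variance), which is the trade-off the $O(n\epsilon)$ bound in the abstract quantifies.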
