Sequential Monte Carlo with Highly Informative Observations

16 May 2014
P. Del Moral
Lawrence M. Murray
arXiv: 1405.4081
Abstract

We propose sequential Monte Carlo (SMC) methods for sampling the posterior distribution of state-space models under highly informative observation regimes, a situation in which standard SMC methods can perform poorly. A special case is simulating bridges between given initial and final values. The basic idea is to introduce a schedule of intermediate weighting and resampling times between observation times, which guide particles towards the final state. This can always be done for continuous-time models, and may be done for discrete-time models under sparse observation regimes. The methods support multivariate models with partial observation, do not require simulation of the backward state process, and, where possible, avoid point-wise evaluation of the forward transition density. When simulating bridges, the last cannot be avoided entirely without concessions, and we suggest an ε-ball approach (reminiscent of Approximate Bayesian Computation) as a workaround. Compared to the bootstrap particle filter, the new methods deliver substantially reduced mean squared error in normalising constant estimates, even after accounting for execution time. The methods are demonstrated for state estimation with two toy examples, and for parameter estimation (within a particle marginal Metropolis-Hastings sampler) with three applied examples in econometrics, epidemiology and marine biogeochemistry.
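The construction described in the abstract (intermediate weighting and resampling times that pull particles towards a fixed end point, with an ε-ball terminal weight standing in for a transition-density evaluation) can be illustrated with a toy sketch. The sketch below is not the paper's algorithm: the Brownian-motion model, the Gaussian lookahead potential used for the intermediate weights, and all function and parameter names are illustrative assumptions chosen only to show the shape of the idea.

```python
import numpy as np

def bridge_smc(x0, xT, T, n_steps, n_particles, sigma=1.0, eps=0.1, rng=None):
    """Toy SMC bridge sampler for a scalar Brownian motion between x0 and xT.

    Illustrative sketch only: intermediate weights use a Gaussian "lookahead"
    towards xT (an assumed heuristic, not the paper's weighting scheme), and
    the terminal weight is an epsilon-ball indicator in the spirit of ABC,
    so the forward transition density is never evaluated point-wise.
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.full(n_particles, float(x0))
    log_z = 0.0  # running sum of log average weights (SMC accumulator)

    for k in range(1, n_steps + 1):
        t = k * dt
        # Propagate particles under the forward (unconditioned) dynamics.
        x = x + sigma * np.sqrt(dt) * rng.standard_normal(n_particles)

        if k < n_steps:
            # Intermediate weighting: favour particles that can still
            # plausibly reach xT in the remaining time (Gaussian lookahead).
            remaining_var = sigma**2 * (T - t)
            logw = -0.5 * (xT - x) ** 2 / remaining_var
        else:
            # Terminal weighting: epsilon-ball around xT instead of a
            # point-wise transition-density evaluation.
            logw = np.where(np.abs(x - xT) < eps, 0.0, -np.inf)

        m = logw.max()
        if not np.isfinite(m):
            raise RuntimeError("all particles missed the epsilon-ball")
        w = np.exp(logw - m)
        log_z += m + np.log(w.mean())

        # Multinomial resampling at every weighting time.
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]

    return x, log_z

# Example usage: bridge from 0.0 to 2.0 over unit time.
particles, log_z = bridge_smc(0.0, 2.0, T=1.0, n_steps=20, n_particles=500)
```

Compared with a plain bootstrap filter, which would only weight at the terminal time and so waste most particles, the intermediate weighting and resampling steps concentrate computation on trajectories compatible with the end point, which is the mechanism the abstract credits for the reduced variance of the normalising-constant estimates.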
