On the performance of particle filters with adaptive number of particles

4 November 2019
Victor Elvira
Joaquín Míguez
Petar M. Djurić
arXiv: 1911.01383
Abstract

We investigate the performance of a class of particle filters (PFs) that can automatically tune their computational complexity by evaluating online certain predictive statistics which are invariant for a broad class of state-space models. To be specific, we propose a family of block-adaptive PFs based on the methodology of Elvira et al. (2017). In this class of algorithms, the number of Monte Carlo samples (known as particles) is adjusted periodically, and we prove that the theoretical error bounds of the PF actually adapt to the updates in the number of particles. The evaluation of the predictive statistics that lies at the core of the methodology is done by generating fictitious observations, i.e., particles in the observation space. We study, both analytically and numerically, the impact of the number K of these particles on the performance of the algorithm. In particular, we prove that if the predictive statistics with K fictitious observations converged exactly, then the particle approximation of the filtering distribution would match the first K elements in a series of moments of the true filter. This result can be understood as a converse to some convergence theorems for PFs. From this analysis, we deduce an alternative predictive statistic that can be computed (for some models) without sampling any fictitious observations at all. Finally, we conduct an extensive simulation study that illustrates the theoretical results and provides further insights into the complexity, performance, and behavior of the new class of algorithms.
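The sketch below is a minimal, illustrative take on the kind of scheme the abstract describes (it is not the authors' reference implementation): a bootstrap PF that, every W steps, decides whether to double or halve the number of particles N based on a predictive statistic computed from K fictitious observations drawn from the particle cloud. The specific rank statistic, the uniformity score, the doubling/halving thresholds, the block length W, and the function names (propagate, likelihood, sample_obs) are assumptions made for this sketch; scalar observations are assumed so that ranks are well defined.

```python
import numpy as np

def block_adaptive_pf(y, propagate, likelihood, sample_obs,
                      x0, N=128, K=7, W=20, N_min=32, N_max=8192, seed=None):
    """Bootstrap PF with block-adaptive particle count (illustrative sketch).

    y           : sequence of scalar observations y_1, ..., y_T
    propagate   : (particles, rng) -> one-step transition applied to the cloud
    likelihood  : (y_t, particles) -> p(y_t | x) evaluated per particle
    sample_obs  : (single particle, rng) -> one fictitious observation
    """
    rng = np.random.default_rng(seed)
    x = np.repeat(np.atleast_2d(np.asarray(x0, dtype=float)), N, axis=0)
    ranks, means = [], []
    for t, y_t in enumerate(y):
        x = propagate(x, rng)                          # predictive particles

        # Predictive statistic: rank of the real observation among K
        # fictitious observations drawn from the particle cloud.
        idx = rng.integers(len(x), size=K)
        y_fict = np.array([sample_obs(x[i], rng) for i in idx])
        ranks.append(int(np.sum(y_fict < y_t)))        # value in {0, ..., K}

        # Standard bootstrap update: weight, estimate, resample.
        w = likelihood(y_t, x)
        w = w / w.sum()
        means.append((w[:, None] * x).sum(axis=0))
        x = x[rng.choice(len(x), size=len(x), p=w)]

        # Block adaptation: under convergence the ranks are close to uniform
        # on {0,...,K}; a crude chi-square-style score decides whether to
        # refine or coarsen the cloud (thresholds are illustrative only).
        if (t + 1) % W == 0:
            counts = np.bincount(ranks, minlength=K + 1)
            expected = W / (K + 1)
            score = np.sum((counts - expected) ** 2) / expected
            if score > 2.0 * K:          # ranks far from uniform -> more particles
                N = min(2 * N, N_max)
            elif score < 0.5 * K:        # comfortably uniform -> save computation
                N = max(N // 2, N_min)
            x = x[rng.integers(len(x), size=N)]        # resize the cloud
            ranks = []
    return np.array(means)

# Toy linear-Gaussian usage (illustrative model, not from the paper):
# propagate  = lambda x, rng: 0.9 * x + rng.normal(0.0, 1.0, size=x.shape)
# likelihood = lambda y_t, x: np.exp(-0.5 * (y_t - x[:, 0]) ** 2)
# sample_obs = lambda xi, rng: xi[0] + rng.normal()
```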
