
Linear Complexity Gibbs Sampling for Generalized Labeled Multi-Bernoulli Filtering

29 November 2022
Changbeom Shim
Ba-Tuong Vo
Ba-Ngu Vo
Jonah Ong
Diluka Moratuwage
arXiv:2211.16041v2 (abs / PDF / HTML)
Abstract

Generalized Labeled Multi-Bernoulli (GLMB) densities arise in a host of multi-object system applications, analogous to Gaussians in single-object filtering. However, computing the GLMB filtering density requires solving NP-hard problems. To alleviate this computational bottleneck, we develop a linear-complexity Gibbs sampling framework for GLMB density computation. Specifically, we propose a tempered Gibbs sampler that exploits the structure of the GLMB filtering density to achieve an $\mathcal{O}(T(P+M))$ complexity, where $T$ is the number of iterations of the algorithm, and $P$ and $M$ are the numbers of hypothesized objects and measurements, respectively. This innovation enables the complexity of the GLMB filter implementation to be reduced from $\mathcal{O}(TP^{2}M)$ to $\mathcal{O}(T(P+M+\log T)+PM)$. Moreover, the proposed framework provides the flexibility for trade-offs between tracking performance and computational load. Convergence of the proposed Gibbs sampler is established, and numerical studies are presented to validate the proposed GLMB filter implementation.
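The paper's tempered, structure-exploiting sampler is not spelled out in the abstract, but the data-association Gibbs sampling it accelerates can be illustrated generically. The sketch below is a minimal, assumption-laden illustration rather than the authors' method: the function name `gibbs_association`, the flat weight matrix `eta`, and the encoding (0 for "undetected", 1..M for a measurement index) are all hypothetical simplifications of the GLMB association problem. Each Gibbs iteration resamples every object's assignment from its conditional distribution given the other objects' assignments, zeroing out measurements already claimed so that assignments remain one-to-one.

```python
import numpy as np

def gibbs_association(eta, T, rng=None):
    """Resample multi-object data associations by Gibbs sampling.

    eta : (P, M+1) array of positive weights. eta[i, 0] is the weight of
          hypothesized object i being undetected; eta[i, j] for j >= 1 is
          the weight of object i generating measurement j. This flat weight
          matrix is an illustrative stand-in, not the paper's parameterization.
    T   : number of Gibbs iterations.

    Returns the set of distinct association vectors visited, each a tuple
    gamma with gamma[i] = 0 (undetected) or a measurement index in 1..M.
    """
    rng = np.random.default_rng() if rng is None else rng
    P, M1 = eta.shape
    gamma = np.zeros(P, dtype=int)            # start with all objects undetected
    visited = {tuple(gamma)}
    for _ in range(T):
        for i in range(P):                    # resample one object at a time
            # Exclude measurements claimed by the other objects, enforcing
            # the at-most-one-object-per-measurement constraint.
            taken = set(gamma[np.arange(P) != i].tolist()) - {0}
            w = eta[i].copy()
            for j in taken:
                w[j] = 0.0
            w /= w.sum()                      # eta[i, 0] > 0 keeps this valid
            gamma[i] = rng.choice(M1, p=w)
        visited.add(tuple(gamma))
    return visited

# Hypothetical toy run: 3 hypothesized objects, 4 measurements.
rng = np.random.default_rng(0)
eta = rng.uniform(0.1, 1.0, size=(3, 5))
print(len(gibbs_association(eta, T=1000, rng=rng)))
```

Note that each sweep of this naive sampler touches all $P$ objects and, per object, does $\mathcal{O}(P+M)$ work to build the conditional, which is roughly where the superlinear cost of the conventional $\mathcal{O}(TP^{2}M)$ implementation comes from; the paper's contribution is a tempered sampler whose per-iteration cost is reduced to $\mathcal{O}(P+M)$ by exploiting the structure of the GLMB filtering density.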
