RobMOT: Robust 3D Multi-Object Tracking by Observational Noise and State Estimation Drift Mitigation on LiDAR PointCloud

19 May 2024
Mohamed Nagy
Naoufel Werghi
Bilal Hassan
Jorge Dias
Majid Khonji
    VOT
Abstract

This paper addresses limitations in 3D tracking-by-detection methods, particularly in identifying legitimate trajectories and reducing state estimation drift in Kalman filters. Existing methods often use threshold-based filtering of detection scores, which can fail for distant and occluded objects, leading to false positives. To tackle this, we propose a novel track validity mechanism and a multi-stage observational gating process that significantly reduce ghost tracks and enhance tracking performance. Our method achieves a 29.47% improvement in Multi-Object Tracking Accuracy (MOTA) on the KITTI validation dataset with the SECOND detector. Additionally, a refined Kalman filter term reduces localization noise, improving Higher Order Tracking Accuracy (HOTA) by 4.8%. The online framework, RobMOT, outperforms state-of-the-art methods across multiple detectors, with HOTA improvements of up to 3.92% on the KITTI testing dataset and 8.7% on the validation dataset, while achieving low identity-switch scores. RobMOT excels in challenging scenarios, tracking distant objects and objects under prolonged occlusion, with a 1.77% MOTA improvement on the Waymo Open dataset, and operates at 3221 FPS on a single CPU, proving its efficiency for real-time multi-object tracking.
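The abstract only outlines the approach, so, as background, here is a minimal sketch of the generic building blocks it refers to: a constant-velocity Kalman filter per track and a track-level confirmation rule that defers the detection-score decision to the accumulated track instead of discarding low-score detections outright. All class names, noise values, and thresholds below are illustrative assumptions, not RobMOT's actual formulation.

# Illustrative sketch only (assumed names and values, not the RobMOT implementation):
# a generic tracking-by-detection track with a constant-velocity Kalman filter over
# a 3D box centroid, plus a track-level confirmation rule.
import numpy as np

class KalmanBoxTrack:
    """Constant-velocity Kalman filter over a 3D centroid (x, y, z)."""

    def __init__(self, center, score, dt=0.1):
        self.x = np.hstack([center, np.zeros(3)])            # state: [x, y, z, vx, vy, vz]
        self.P = np.eye(6) * 10.0                             # state covariance
        self.F = np.eye(6)                                    # constant-velocity motion model
        self.F[:3, 3:] = np.eye(3) * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])     # observe position only
        self.Q = np.eye(6) * 0.01                             # process noise
        self.R = np.eye(3) * 0.1                              # observation noise
        self.hits = 1                                         # number of matched detections
        self.best_score = score                               # highest detection score seen

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]

    def update(self, z, score):
        y = z - self.H @ self.x                               # innovation
        S = self.H @ self.P @ self.H.T + self.R               # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        self.hits += 1
        self.best_score = max(self.best_score, score)

def is_confirmed(track, min_hits=3, score_hi=0.5):
    """Report a track only after several matches AND one confident detection,
    rather than dropping every low-score detection up front (thresholds are
    illustrative placeholders)."""
    return track.hits >= min_hits and track.best_score >= score_hi

In this toy setup, a distant object detected repeatedly with low scores can still be confirmed once a single confident detection arrives, which is the kind of failure mode the abstract attributes to per-detection thresholding; the actual RobMOT validity and gating criteria are described in the paper.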

@article{nagy2025_2405.11536,
  title={RobMOT: Robust 3D Multi-Object Tracking by Observational Noise and State Estimation Drift Mitigation on LiDAR PointCloud},
  author={Mohamed Nagy and Naoufel Werghi and Bilal Hassan and Jorge Dias and Majid Khonji},
  journal={arXiv preprint arXiv:2405.11536},
  year={2025}
}