
EndoSensorFusion: Particle Filtering-Based Multi-sensory Data Fusion with Switching State-Space Model for Endoscopic Capsule Robots using Recurrent Neural Network Kinematics

Abstract

A reliable, real-time multi-sensor fusion capability is crucial for the localization of actively controlled next-generation endoscopic capsule robots, an emerging minimally invasive diagnostic technology for the inspection of the gastrointestinal (GI) tract and the diagnosis of a wide range of diseases and pathologies. In this study, we propose a novel multi-sensor fusion approach for real-time endoscopic capsule robot localization, based on a switching observation model with non-linear kinematics learned by recurrent neural networks. Our method sequentially estimates a hidden state vector from noisy pose observations delivered by multiple sensors, namely 5 degree-of-freedom (5-DoF) absolute pose estimates from a 2D magnetic Hall-effect sensor array and 6-DoF relative pose estimates from a visual odometry approach. In addition, the proposed method detects and handles sensor failures occurring in between nominal sensor states. Detailed analyses and evaluations on a real pig stomach dataset show that our system achieves high translational and rotational accuracy across different types of endoscopic capsule robot trajectories.
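The switching-observation idea described above, skipping the measurement update whenever a sensor is flagged as failed so that only the motion model propagates the belief, can be sketched as a plain bootstrap particle filter. This is a hypothetical simplification for illustration: the paper's actual motion model is learned by a recurrent neural network, whereas here `motion_fn`, the Gaussian likelihood, and all noise parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, sensor_ok, motion_fn,
            motion_std=0.1, meas_std=0.5):
    """One predict/update cycle of a bootstrap particle filter with a
    switching observation model: when sensor_ok is False (sensor failure
    detected), the measurement update is skipped and only the motion
    model propagates the belief.
    """
    # Predict: push particles through the motion model plus process noise.
    particles = motion_fn(particles) + rng.normal(0.0, motion_std, particles.shape)
    if sensor_ok:
        # Update: reweight particles by a Gaussian measurement likelihood.
        lik = np.exp(-0.5 * np.sum(((particles - z) / meas_std) ** 2, axis=1))
        weights = weights * lik
        weights = weights / weights.sum()
        # Multinomial resampling to counter weight degeneracy.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```

In a multi-sensor setting, one such conditional update would run per sensor (e.g. the magnetic array and the visual odometry stream), each with its own failure flag and measurement noise.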
