EndoSensorFusion: Particle Filtering-Based Multi-sensory Data Fusion with Switching State-Space Model for Endoscopic Capsule Robots using Recurrent Neural Network Kinematics

Abstract

A reliable, real-time multi-sensor fusion functionality is crucial for the localization of actively controlled next-generation endoscopic capsule robots, an emerging minimally invasive diagnostic technology for inspection of the gastrointestinal (GI) tract and diagnosis of a wide range of diseases and pathologies. In this study, we propose a novel multi-sensor fusion approach for real-time endoscopic capsule robot localization, based on a switching observation model with non-linear kinematics learned by recurrent neural networks. Our method performs sequential estimation of a hidden state vector from noisy pose observations delivered by multiple sensors: a 5 degree-of-freedom (5-DoF) absolute pose estimate from a magnetic 2D Hall-effect sensor array and a 6-DoF relative pose estimate from a visual odometry approach. In addition, the proposed method is capable of detecting and handling sensor failures in between nominal sensor states. Detailed analyses and evaluations of ex-vivo experiments on a porcine stomach model show that our system achieves high translational and rotational accuracy for different types of endoscopic capsule robot trajectories.
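The core idea of the abstract — a particle filter that fuses two noisy pose sensors under a switching observation model, so that a failed sensor is automatically down-weighted — can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a 1-D position state, a random-walk motion model as a stand-in for the learned RNN kinematics, and assumed noise parameters (`mag_std`, `vo_std`, `p_fail`, `fail_std`) chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_fusion(obs_magnetic, obs_visual, n_particles=500,
                           proc_std=0.05, mag_std=0.1, vo_std=0.1,
                           p_fail=0.1, fail_std=1.0):
    """Fuse two noisy 1-D position sensor streams with a particle filter.

    Each sensor's likelihood is a two-component mixture: with probability
    (1 - p_fail) the sensor is nominal (small noise std), with probability
    p_fail it has failed (broad noise std) -- a simple switching
    observation model that marginalizes out the failure indicator.
    """
    particles = rng.normal(0.0, 1.0, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z_mag, z_vo in zip(obs_magnetic, obs_visual):
        # Propagate particles with random-walk kinematics
        # (a stand-in for the learned non-linear RNN motion model).
        particles = particles + rng.normal(0.0, proc_std, n_particles)

        def mixture_lik(z, std):
            # Switching likelihood: nominal vs. failed Gaussian mixture.
            nominal = np.exp(-0.5 * ((z - particles) / std) ** 2) / std
            failed = np.exp(-0.5 * ((z - particles) / fail_std) ** 2) / fail_std
            return (1 - p_fail) * nominal + p_fail * failed

        # Independent sensors: multiply the per-sensor likelihoods.
        weights = weights * mixture_lik(z_mag, mag_std) * mixture_lik(z_vo, vo_std)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))

        # Resample when the effective sample size degenerates.
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles = particles[idx]
            weights = np.full(n_particles, 1.0 / n_particles)
    return np.array(estimates)

# Simulate a slowly moving capsule; the visual sensor suffers a
# failure burst mid-trajectory, which the mixture likelihood absorbs.
true_pos = np.linspace(0.0, 1.0, 50)
z_mag = true_pos + rng.normal(0.0, 0.1, 50)
z_vo = true_pos + rng.normal(0.0, 0.1, 50)
z_vo[20:30] += rng.normal(0.0, 1.0, 10)  # simulated sensor failure
est = particle_filter_fusion(z_mag, z_vo)
print("mean abs error:", float(np.mean(np.abs(est - true_pos))))
```

Because the failure mixture keeps some likelihood mass on outlier observations, the filter degrades gracefully during the simulated visual-odometry dropout instead of diverging, which is the behavior the abstract's failure-handling claim refers to.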
