OF-VO: Reliable Navigation among Pedestrians Using Commodity Sensors
We present a novel algorithm for the safe navigation of a mobile robot in uncertain environments among pedestrians. Our approach uses commodity visual sensors, namely a monocular camera and a 2D lidar, to explicitly predict the velocities and positions of surrounding obstacles through optical flow estimation, object detection, and sensor fusion. Given these probabilistic partial observations of the environment, we present a modified velocity-obstacle (VO) algorithm that computes velocities for navigating the robot toward its target. A key aspect of our work is the coupling of the perception (OF: optical flow) and planning (VO) components for reliable navigation. Overall, our OF-VO algorithm is a hybrid combination of learning-based and model-based methods and offers better performance than prior algorithms in terms of navigation time and success rate of collision avoidance. We highlight the real-time performance of OF-VO in simulated and real-world dynamic scenes on a Turtlebot navigating among pedestrians with commodity sensors. A demo video is available at https://www.youtube.com/watch?v=5sYhZrGwsxM
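For context, the classical velocity-obstacle test underlying such planners can be sketched as follows. This is a minimal deterministic illustration only, not the paper's OF-VO variant (which additionally handles probabilistic observations from optical flow and sensor fusion); all function and parameter names here are illustrative.

```python
import math

def in_velocity_obstacle(p_robot, p_obs, v_obs, v_cand, r_robot, r_obs):
    """Classic VO cone test: return True if candidate robot velocity
    v_cand leads to a future collision with a disc obstacle at p_obs
    moving with constant velocity v_obs."""
    # Relative position of the obstacle and combined safety radius.
    dx, dy = p_obs[0] - p_robot[0], p_obs[1] - p_robot[1]
    dist = math.hypot(dx, dy)
    r = r_robot + r_obs
    if dist <= r:
        return True  # already overlapping
    # Velocity of the robot relative to the obstacle.
    rvx, rvy = v_cand[0] - v_obs[0], v_cand[1] - v_obs[1]
    speed = math.hypot(rvx, rvy)
    if speed == 0.0:
        return False  # no relative motion, no collision
    # Collision cone: relative velocities whose direction deviates from
    # the line of sight by less than the cone half-angle are unsafe.
    half_angle = math.asin(r / dist)
    cos_a = (rvx * dx + rvy * dy) / (speed * dist)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    return angle < half_angle
```

A planner using this test samples candidate velocities and keeps only those outside the VO of every obstacle, then picks the admissible velocity closest to the preferred velocity toward the goal.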