Multi-Resolution POMDP Planning for Multi-Object Search in 3D

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020
Abstract

Robots operating in household environments must find objects on shelves, under tables, and in cupboards. Previous work often formulates the object search problem as a POMDP (Partially Observable Markov Decision Process), yet constrains the search space to 2D to reduce computational complexity, even though objects exist in a rich 3D environment. We present a POMDP formulation for multi-object search in a 3D region with a frustum-shaped field of view, and an efficient multi-resolution planning algorithm to solve this POMDP. To achieve efficient planning, our algorithm uses a new octree-based representation that captures beliefs at different resolution levels, enabling the agent to induce abstract POMDPs with dramatically smaller state and observation spaces. Our evaluation in a simulated 3D domain shows that our approach achieves significantly higher reward (≥ 51% in the largest instance) and finds more objects than baselines without a resolution hierarchy, and that its advantage grows as the search space becomes larger and as sensor uncertainty increases. We show that our approach enables a mobile robot to automatically find objects placed at different heights in two 10 m² × 2 m regions by moving its base and actuating its torso.
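The abstract's key idea is an octree whose nodes store belief mass at multiple resolution levels, so that coarse nodes aggregate their children and the planner can query a belief at any granularity. The following is a minimal illustrative sketch of that property, not the paper's implementation; all class and method names (`OctreeBeliefNode`, `subdivide`, `mass`) are hypothetical.

```python
# Hedged sketch (hypothetical names, not the authors' code): an octree whose
# nodes hold the probability mass that a target object lies in their voxel
# region. Coarse nodes aggregate their children, so beliefs can be read out
# at any resolution level -- the property that lets a multi-resolution
# planner induce abstract POMDPs with smaller state spaces.

class OctreeBeliefNode:
    def __init__(self, prob=0.0):
        self.prob = prob        # belief mass assigned to this voxel region
        self.children = None    # None for a leaf; else a list of 8 children

    def subdivide(self):
        """Split this voxel into 8 octants, dividing its mass uniformly."""
        self.children = [OctreeBeliefNode(self.prob / 8.0) for _ in range(8)]

    def mass(self):
        """Belief mass of this region, summed over children if subdivided."""
        if self.children is None:
            return self.prob
        return sum(c.mass() for c in self.children)


# Uniform prior over the whole search region, refined non-uniformly:
root = OctreeBeliefNode(prob=1.0)
root.subdivide()              # resolution level 1: 8 octants
root.children[0].subdivide()  # refine only one octant to level 2
```

Because mass is conserved across levels, a coarse query over an octant returns the same probability whether or not that octant has been refined; only regions the planner cares about need fine resolution.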
