Visualizing Robot Intent for Object Handovers with Augmented Reality

6 March 2021
Rhys Newbury
Akansel Cosgun
Tysha Crowley-Davis
Wesley P. Chan
Tom Drummond
Elizabeth A. Croft
Abstract

Humans are highly skilled at communicating their intent about when and where a handover will occur. However, even state-of-the-art robotic implementations of handovers typically lack such communication skills. This study investigates visualizing the robot's internal state and intent for human-to-robot handovers using augmented reality. Specifically, we explore the use of visualized 3D models of the object and the robotic gripper to communicate the robot's estimate of where the object is and the pose in which the robot intends to grasp it. We tested this design in a user study with 16 participants, in which each participant handed a cube-shaped object to the robot 12 times. Results show that communicating robot intent via augmented reality substantially improves the users' perceived experience of the handovers. The results also indicate that the benefit of augmented reality for perceived safety and fluency of the interaction is even more pronounced when the robot makes errors in localizing the object.
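At its core, the visualization described in the abstract amounts to overlaying two poses in the shared workspace: the robot's estimated object pose and the gripper pose it intends to reach, the latter obtained by composing the object estimate with a grasp offset. The sketch below illustrates this composition with homogeneous transforms; it is not the authors' implementation, and the frame names, numeric poses, and grasp offset are assumptions made purely for illustration.

```python
# Illustrative sketch (not the paper's code): compute the intended gripper pose
# from the robot's estimated object pose and an assumed grasp offset, i.e. the
# two poses that would be rendered as 3D overlays in the AR view.
import numpy as np


def pose_to_matrix(position, rpy):
    """Build a 4x4 homogeneous transform from a position (x, y, z) in metres
    and roll/pitch/yaw angles in radians (ZYX convention)."""
    roll, pitch, yaw = rpy
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    R = np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = position
    return T


# Robot's estimate of the handed-over object's pose in the robot base frame
# (e.g. from a perception pipeline). This is the first AR overlay, which lets
# the human see how accurately the robot has localized the object.
T_base_object = pose_to_matrix(position=[0.55, 0.10, 0.30], rpy=[0.0, 0.0, 0.4])

# Assumed grasp offset relative to the object: approach from 10 cm above the
# object with the gripper pointing downward.
T_object_grasp = pose_to_matrix(position=[0.0, 0.0, 0.10], rpy=[np.pi, 0.0, 0.0])

# Intended gripper pose in the base frame: the second AR overlay, communicating
# where the robot plans to grasp.
T_base_grasp = T_base_object @ T_object_grasp

print("Object overlay pose:\n", np.round(T_base_object, 3))
print("Intended grasp overlay pose:\n", np.round(T_base_grasp, 3))
```

In a deployed system, these two poses would be streamed to an AR headset and rendered as virtual 3D models of the object and gripper, so the human can judge whether the robot's estimate and planned grasp look correct before releasing the object.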

arXiv: 2103.04055