MIHRaGe: A Mixed-Reality Interface for Human-Robot Interaction via Gaze-Oriented Control

6 May 2025
Rafael R. Baptista
Nina R. Gerszberg
Ricardo V. Godoy
Gustavo J. G. Lahr
Abstract

Individuals with upper limb mobility impairments often require assistive technologies to perform activities of daily living. While gaze-tracking has emerged as a promising method for robotic assistance, existing solutions lack sufficient feedback mechanisms, leading to uncertainty in user intent recognition and reduced adaptability. This paper presents the MIHRaGe interface, an integrated system that combines gaze-tracking, robotic assistance, and mixed reality to create an immersive environment for controlling the robot using only eye movements. The system was evaluated through an experimental protocol involving four participants, assessing gaze accuracy, robotic positioning precision, and the overall success of a pick-and-place task. Results showed an average gaze fixation error of 1.46 cm, with individual variations ranging from 1.28 cm to 2.14 cm. The robotic arm demonstrated an average positioning error of ±1.53 cm, with discrepancies attributed to interface resolution and calibration constraints. In the pick-and-place task, the system achieved a success rate of 80%, highlighting its potential for improving accessibility in human-robot interaction with visual feedback to the user.
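The gaze accuracy figure quoted above is an average fixation error in centimetres. As a rough illustration of that metric only, the Python sketch below computes the mean Euclidean distance between recorded fixation points and their intended targets on the interface plane; the function name and sample coordinates are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch (not the authors' code): illustrating the "average gaze
# fixation error" metric as the mean Euclidean distance, in cm, between
# recorded gaze fixation points and the known target positions.

import math


def mean_fixation_error(fixations, targets):
    """Average Euclidean distance (cm) between gaze fixations and targets.

    fixations, targets: lists of (x, y) coordinates in cm, paired by index.
    """
    if len(fixations) != len(targets):
        raise ValueError("fixations and targets must be paired one-to-one")
    errors = [
        math.hypot(fx - tx, fy - ty)
        for (fx, fy), (tx, ty) in zip(fixations, targets)
    ]
    return sum(errors) / len(errors)


# Example with made-up numbers; the paper reports an average error of 1.46 cm.
if __name__ == "__main__":
    fixations = [(10.2, 5.1), (20.9, 14.8), (31.5, 25.3)]
    targets = [(10.0, 5.0), (22.0, 15.0), (30.0, 25.0)]
    print(f"mean fixation error: {mean_fixation_error(fixations, targets):.2f} cm")
```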

View on arXiv
@article{baptista2025_2505.03929,
  title={MIHRaGe: A Mixed-Reality Interface for Human-Robot Interaction via Gaze-Oriented Control},
  author={Rafael R. Baptista and Nina R. Gerszberg and Ricardo V. Godoy and Gustavo J. G. Lahr},
  journal={arXiv preprint arXiv:2505.03929},
  year={2025}
}