arXiv:2404.13777

Explainable Interfaces for Rapid Gaze-Based Interactions in Mixed Reality

21 April 2024
Mengjie Yu
Dustin Harris
Ian Jones
Ting Zhang
Yue Liu
Naveen Sendhilnathan
Narine Kokhlikyan
Fulton Wang
Co Tran
Jordan L. Livingston
Krista E. Taylor
Zhenhong Hu
Mary A. Hood
Hrvoje Benko
Tanya R. Jonker
Abstract

Gaze-based interactions offer a potential way for users to naturally engage with mixed reality (XR) interfaces. Black-box machine learning models have enabled higher accuracy for gaze-based interactions, but because of their opacity, users may be unable to understand the model and effectively adapt their gaze behavior to achieve high-quality interaction. We posit that explainable AI (XAI) techniques can facilitate understanding of and interaction with gaze-based, model-driven systems in XR. To study this, we built a real-time, multi-level XAI interface for gaze-based interaction using a deep learning model and evaluated it during a visual search task in XR. A between-subjects study revealed that participants who interacted with the XAI made more accurate selections than those who did not use the XAI system (an F1 score increase of 10.8%). Additionally, participants who used the XAI system adapted their gaze behavior over time to make more effective selections. These findings suggest that XAI can potentially assist users in collaborating more effectively with model-driven interactions in XR.
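The selection-accuracy comparison above is reported as an F1 score. As an illustrative sketch only (the counts below are hypothetical and not taken from the paper), F1 for gaze-based selections could be computed by treating each selection as a binary classification against the intended target:

```python
def f1_score(true_positives: int, false_positives: int, false_negatives: int) -> float:
    """F1 is the harmonic mean of precision and recall."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a baseline (no-XAI) and an XAI condition:
f1_baseline = f1_score(true_positives=70, false_positives=20, false_negatives=20)
f1_xai = f1_score(true_positives=80, false_positives=12, false_negatives=12)
absolute_gain = f1_xai - f1_baseline  # absolute F1 improvement
```

The harmonic mean penalizes imbalance between precision and recall, which is why F1 is a common single-number summary for selection tasks where both false activations and missed selections matter.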
