Measuring eye-tracking accuracy and its impact on usability in Apple Vision Pro

1 June 2024
Zehao Huang
Gancheng Zhu
Xiaoting Duan
Rong Wang
Yongkai Li
Shuai Zhang
Zhiguo Wang
arXiv: 2406.00255
Abstract

With built-in eye-tracking cameras, the recently released Apple Vision Pro (AVP) mixed reality (MR) headset features gaze-based interaction, eye image rendering on external screens, and iris recognition for device unlocking. One of the technological advancements of the AVP is its heavy reliance on gaze- and gesture-based interaction. However, limited information is available regarding the technological specifications of the eye-tracking capability of the AVP, and raw gaze data is inaccessible to developers. This study evaluates the eye-tracking accuracy of the AVP with two sets of tests spanning both MR and virtual reality (VR) applications. This study also examines how eye-tracking accuracy relates to user-reported usability. The results revealed an overall eye-tracking accuracy of 1.11° and 0.93° in two testing setups, within a field of view (FOV) of approximately 34° × 18°. The usability and learnability scores of the AVP, measured using the standard System Usability Scale (SUS), were 75.24 and 68.26, respectively. Importantly, no statistically reliable correlation was found between eye-tracking accuracy and usability scores. These results suggest that eye-tracking accuracy is critical for gaze-based interaction, but it is not the sole determinant of user experience in VR/AR.
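
The accuracy and usability figures quoted in the abstract follow standard definitions. As a rough illustration only (this is not code from the paper, and all names below are hypothetical), the sketch shows how angular gaze accuracy in degrees of visual angle and SUS scores are commonly computed.

```python
# Minimal sketch, assuming gaze and target directions are available as 3D
# vectors in a shared head-fixed frame. Not the authors' pipeline.
import numpy as np


def angular_error_deg(gaze_dirs: np.ndarray, target_dirs: np.ndarray) -> np.ndarray:
    """Angle in degrees between gaze direction vectors and target direction
    vectors; both arrays have shape (N, 3), one sample per row."""
    g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    t = target_dirs / np.linalg.norm(target_dirs, axis=1, keepdims=True)
    cos_theta = np.clip(np.sum(g * t, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))


def sus_score(responses: list[int]) -> float:
    """Standard System Usability Scale score from ten 1-5 Likert responses:
    odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response), and the sum is scaled by 2.5 to a 0-100 range."""
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so i % 2 == 0 is an odd-numbered item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = rng.normal(size=(100, 3))
    gaze = targets + rng.normal(scale=0.02, size=(100, 3))  # simulated gaze noise
    print(f"mean accuracy: {angular_error_deg(gaze, targets).mean():.2f} deg")
    print(f"SUS: {sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]):.1f}")
```

In practice, per-target accuracy values like these would be averaged over validation points within the tested FOV, and the correlation with SUS scores would be assessed with a standard test such as Pearson's r.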
