
Appearance-Based Gaze Estimation via Gaze Decomposition and Single Gaze Point Calibration

11 May 2019
Zhaokang Chen
Bertram E. Shi
arXiv: 1905.04451
Abstract

Appearance-based gaze estimation provides relatively unconstrained gaze tracking. However, subject-independent models achieve limited accuracy, partly due to individual variations. To improve estimation, we propose a novel gaze decomposition method and a single gaze point calibration method, motivated by our finding that the inter-subject squared bias exceeds the intra-subject variance for a subject-independent estimator. We decompose the gaze angle into a subject-dependent bias term and a subject-independent difference term between the gaze angle and the bias. The difference term is estimated by a deep convolutional network. For calibration-free tracking, we set the subject-dependent bias term to zero. For single gaze point calibration, we estimate the bias from a few images taken as the subject gazes at a point. Experiments on three datasets indicate that as a calibration-free estimator, the proposed method outperforms the state-of-the-art methods that use a single model by up to 10.0%. The proposed calibration method is robust and reduces estimation error significantly (up to 35.6%), achieving state-of-the-art performance for appearance-based eye trackers with calibration.
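At inference time, the decomposition described above reduces to a simple additive correction. The sketch below illustrates the idea, not the authors' implementation: the names (estimate_difference, calibrate_bias, predict_gaze) are hypothetical, and the deep convolutional network is stubbed out so the example runs standalone.

import numpy as np

def estimate_difference(image):
    """Stand-in for the paper's deep convolutional network, which maps an
    eye/face image to the subject-independent difference term (yaw, pitch)
    of the gaze angle. Returns zeros here so the sketch runs; in practice
    this would be the trained CNN's forward pass."""
    return np.zeros(2)

def calibrate_bias(calib_images, calib_gaze):
    """Single gaze point calibration: estimate the subject-dependent bias
    from a few images taken while the subject fixates one known point.

    calib_images : iterable of images captured during the fixation
    calib_gaze   : (2,) array, true gaze angle of the calibration point
    """
    preds = np.stack([estimate_difference(img) for img in calib_images])
    # The bias is the average residual between the known gaze angle and
    # the subject-independent estimate.
    return calib_gaze - preds.mean(axis=0)

def predict_gaze(image, bias=None):
    """Decomposed estimate: gaze = difference term + subject bias.
    Passing bias=None corresponds to calibration-free tracking, where
    the bias term is set to zero."""
    b = np.zeros(2) if bias is None else np.asarray(bias)
    return estimate_difference(image) + b

# Example: calibrate on five frames of a single fixation, then predict.
frames = [np.zeros((36, 60)) for _ in range(5)]     # placeholder images
bias = calibrate_bias(frames, calib_gaze=np.array([5.0, -3.0]))
print(predict_gaze(np.zeros((36, 60)), bias=bias))  # -> [ 5. -3.]

Because the bias is a constant per subject, averaging the residual over even a handful of calibration frames suffices, which is why a single gaze point is enough to realize the reported error reduction.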
