ResearchTrend.AI


Eye-See-You: Reverse Pass-Through VR and Head Avatars

24 May 2025
Ankan Dash
Jingyi Gu
Guiling Wang
Chen Chen
Main: 6 pages, 8 figures, 6 tables; Appendix: 3 pages
Abstract

Virtual Reality (VR) headsets, while integral to the evolving digital ecosystem, present a critical challenge: the occlusion of users' eyes and portions of their faces, which hinders visual communication and may contribute to social isolation. To address this, we introduce RevAvatar, an innovative framework that leverages AI methodologies to enable reverse pass-through technology, fundamentally transforming VR headset design and interaction paradigms. RevAvatar integrates state-of-the-art generative models and multimodal AI techniques to reconstruct high-fidelity 2D facial images and generate accurate 3D head avatars from partially observed eye and lower-face regions. This framework represents a significant advancement in AI4Tech by enabling seamless interaction between virtual and physical environments, fostering immersive experiences such as VR meetings and social engagements. Additionally, we present VR-Face, a novel dataset comprising 200,000 samples designed to emulate diverse VR-specific conditions, including occlusions, lighting variations, and distortions. By addressing fundamental limitations in current VR systems, RevAvatar exemplifies the transformative synergy between AI and next-generation technologies, offering a robust platform for enhancing human connection and interaction in virtual environments.
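The abstract describes reconstructing a full face from the partially observed eye and lower-face regions left visible around a VR headset. As a toy illustration of that idea, the sketch below fills occluded pixels from visible ones; the mean-fill "generator" here is a hypothetical placeholder, not RevAvatar's actual learned generative model.

```python
# Toy sketch of reverse pass-through reconstruction: visible regions (eyes,
# lower face) are kept, and the occluded region is filled by a generator.
# The mean-fill below is a stand-in for the learned generative model.

def reconstruct_face(image, mask):
    """Fill occluded pixels (mask == 0) from visible ones (mask == 1)."""
    visible = [p for p, m in zip(image, mask) if m == 1]
    fill = sum(visible) / len(visible)  # placeholder for a learned generator
    return [p if m == 1 else fill for p, m in zip(image, mask)]

# Toy 1-D "image": eye and lower-face pixels visible, middle occluded by headset
image = [0.8, 0.7, 0.0, 0.0, 0.6, 0.9]
mask  = [1,   1,   0,   0,   1,   1]
print(reconstruct_face(image, mask))  # occluded pixels replaced by 0.75
```

In the paper's actual pipeline, the fill step is performed by generative models conditioned on the observed regions, and the 2D reconstruction is then lifted to a 3D head avatar.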

@article{dash2025_2505.18869,
  title={Eye-See-You: Reverse Pass-Through VR and Head Avatars},
  author={Ankan Dash and Jingyi Gu and Guiling Wang and Chen Chen},
  journal={arXiv preprint arXiv:2505.18869},
  year={2025}
}