EventEgoHands: Event-based Egocentric 3D Hand Mesh Reconstruction

25 May 2025
Ryosei Hara
Wataru Ikeda
Masashi Hatano
Mariko Isogawa
Main: 5 pages · 8 figures · 3 tables · Bibliography: 1 page · Appendix: 3 pages
Abstract

Reconstructing 3D hand meshes is a challenging but important task for human-computer interaction and AR/VR applications. RGB and/or depth cameras have been widely used for this task; however, methods based on these conventional cameras struggle in low-light environments and under motion blur. To address these limitations, event cameras have attracted attention in recent years for their high dynamic range and high temporal resolution. Despite these advantages, event cameras are sensitive to background noise and camera motion, which has limited existing studies to static backgrounds and fixed cameras. In this study, we propose EventEgoHands, a novel method for event-based 3D hand mesh reconstruction from an egocentric view. Our approach introduces a Hand Segmentation Module that extracts hand regions, effectively mitigating the influence of dynamic background events. We evaluated our approach on the N-HOT3D dataset and demonstrated its effectiveness, improving MPJPE by more than 4.5 cm (approximately 43%).
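
As a rough illustration of the two technical ideas in the abstract (filtering out dynamic background events with a predicted hand-region mask, and scoring reconstructions with MPJPE), here is a minimal sketch in Python. It is not the authors' implementation; the event format, array shapes, and function names are assumptions made for this example.

import numpy as np

def mask_events(events, hand_mask):
    """Keep only events whose pixel falls inside the predicted hand region.

    events: (N, 4) array of (x, y, t, polarity) rows (hypothetical format).
    hand_mask: (H, W) boolean array, True where a hand is predicted.
    """
    xs = events[:, 0].astype(int)
    ys = events[:, 1].astype(int)
    return events[hand_mask[ys, xs]]

def mpjpe(pred_joints, gt_joints):
    """Mean Per-Joint Position Error: the average Euclidean distance between
    predicted and ground-truth 3D joints, in the joints' units (e.g., cm).

    pred_joints, gt_joints: (J, 3) arrays of 3D joint positions.
    """
    return float(np.linalg.norm(pred_joints - gt_joints, axis=-1).mean())

Masking the event stream before any downstream feature extraction is one plausible reading of the Hand Segmentation Module's role described above, not a claim about the paper's actual pipeline.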

@article{hara2025_2505.19169,
  title={EventEgoHands: Event-based Egocentric 3D Hand Mesh Reconstruction},
  author={Ryosei Hara and Wataru Ikeda and Masashi Hatano and Mariko Isogawa},
  journal={arXiv preprint arXiv:2505.19169},
  year={2025}
}