DyTact: Capturing Dynamic Contacts in Hand-Object Manipulation

3 June 2025
Xiaoyan Cong
Angela Xing
Chandradeep Pokhariya
Rao Fu
Srinath Sridhar
arXiv (abs) · PDF · HTML
Main: 9 pages, 8 figures, 3 tables; Bibliography: 3 pages
Abstract

Reconstructing dynamic hand-object contacts is essential for realistic manipulation in AI character animation, XR, and robotics, yet it remains challenging due to heavy occlusions, complex surface details, and limitations in existing capture techniques. In this paper, we introduce DyTact, a markerless method for accurately and non-intrusively capturing dynamic contacts in hand-object manipulation. Our approach leverages a dynamic, articulated representation based on 2D Gaussian surfels to model complex manipulations. By binding these surfels to MANO meshes, DyTact harnesses the inductive bias of template models to stabilize and accelerate optimization. A refinement module addresses time-dependent high-frequency deformations, while a contact-guided adaptive sampling strategy selectively increases surfel density in contact regions to handle heavy occlusion. Extensive experiments demonstrate that DyTact not only achieves state-of-the-art dynamic contact estimation accuracy but also significantly improves novel view synthesis quality, all while operating with fast optimization and efficient memory usage. Project Page: this https URL.
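As a rough illustration of the binding idea the abstract describes, the sketch below attaches surfels to the faces of an articulated template mesh via barycentric coordinates, so the surfels follow the mesh as it deforms. This is a minimal assumption-laden toy, not the authors' implementation: the function names, shapes, and the choice of uniform barycentric sampling are all hypothetical, and the actual MANO binding and surfel parameterization in DyTact may differ.

```python
# Hypothetical sketch: bind surfels to a template mesh (e.g. MANO) via
# barycentric coordinates, then recompute surfel centers per frame.
# Names and shapes are illustrative assumptions, not the paper's code.
import numpy as np

def bind_surfels_to_mesh(faces, surfels_per_face=4, rng=None):
    """Attach surfels to mesh faces with random barycentric coordinates.

    faces: (F, 3) triangle vertex indices.
    Returns (face_ids, bary): per-surfel face index and (N, 3) weights.
    """
    rng = rng or np.random.default_rng(0)
    num_faces = faces.shape[0]
    face_ids = np.repeat(np.arange(num_faces), surfels_per_face)
    # Uniform sampling inside a triangle: draw (u1, u2) in the unit
    # square and reflect points that fall outside the lower triangle.
    u = rng.random((num_faces * surfels_per_face, 2))
    outside = u.sum(axis=1) > 1.0
    u[outside] = 1.0 - u[outside]
    bary = np.stack([1.0 - u.sum(axis=1), u[:, 0], u[:, 1]], axis=1)
    return face_ids, bary

def surfel_positions(vertices, faces, face_ids, bary):
    """Recompute surfel centers from the current (posed) mesh vertices."""
    tri = vertices[faces[face_ids]]            # (N, 3, 3) triangle corners
    return np.einsum('nij,ni->nj', tri, bary)  # barycentric interpolation
```

Because the binding is fixed while only the template pose changes, re-evaluating `surfel_positions` each frame moves the surfels rigidly with the hand, which is one plausible way a template prior can stabilize optimization as the abstract suggests.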

@article{cong2025_2506.03103,
  title={DyTact: Capturing Dynamic Contacts in Hand-Object Manipulation},
  author={Xiaoyan Cong and Angela Xing and Chandradeep Pokhariya and Rao Fu and Srinath Sridhar},
  journal={arXiv preprint arXiv:2506.03103},
  year={2025}
}