FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation

2 June 2025
Longyan Wu, Checheng Yu, Jieji Ren, Li Chen, Ran Huang, Guoying Gu, Hongyang Li
Abstract

Enabling robots to perform contact-rich manipulation remains a pivotal challenge in robot learning, substantially hindered by the data collection gap: inefficient collection and limited sensor setups. While prior work has explored handheld paradigms, their rod-based mechanical structures remain rigid and unintuitive, providing limited tactile feedback and posing challenges for human operators. Motivated by the dexterity and force feedback of human motion, we propose FreeTacMan, a human-centric and robot-free data collection system for accurate and efficient robot manipulation. Concretely, we design a wearable data collection device with dual visuo-tactile grippers, which can be worn on human fingers for intuitive and natural control. A high-precision optical tracking system is introduced to capture end-effector poses while synchronizing visual and tactile feedback. FreeTacMan achieves multiple improvements in data collection performance over prior work and, with the help of visuo-tactile information, enables effective policy learning for contact-rich manipulation tasks. We will release the work to facilitate reproducibility and accelerate research in visuo-tactile manipulation.
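
The abstract mentions synchronizing optical-tracking poses with visual and tactile feedback during collection, but this page does not detail how that alignment is done. The sketch below is a generic, hypothetical nearest-timestamp pairing in Python; the names (`Sample`, `sync_streams`, the `max_skew` tolerance) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: dataclass names, function names, and the
# nearest-timestamp pairing are assumptions, not FreeTacMan's actual code.
from dataclasses import dataclass
from bisect import bisect_left
from typing import List, Tuple

@dataclass
class Sample:
    t: float      # timestamp in seconds
    data: object  # RGB image, tactile reading, or 6-DoF end-effector pose

def nearest(stream: List[Sample], t: float) -> Sample:
    """Return the sample in a time-sorted stream closest to timestamp t."""
    i = bisect_left([s.t for s in stream], t)
    candidates = stream[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))

def sync_streams(poses: List[Sample],
                 rgb: List[Sample],
                 tactile: List[Sample],
                 max_skew: float = 0.02) -> List[Tuple[Sample, Sample, Sample]]:
    """Pair each tracked pose with the closest RGB and tactile samples,
    dropping tuples whose timestamps disagree by more than max_skew seconds."""
    synced = []
    for p in poses:
        r, c = nearest(rgb, p.t), nearest(tactile, p.t)
        if max(abs(r.t - p.t), abs(c.t - p.t)) <= max_skew:
            synced.append((p, r, c))
    return synced
```

In practice, a hardware trigger or shared clock would tighten alignment further; the tolerance-based pairing above simply shows one common way such multi-stream demonstrations can be assembled into training tuples.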

View on arXiv
@article{wu2025_2506.01941,
  title={FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation},
  author={Longyan Wu and Checheng Yu and Jieji Ren and Li Chen and Ran Huang and Guoying Gu and Hongyang Li},
  journal={arXiv preprint arXiv:2506.01941},
  year={2025}
}
Main: 8 pages · Appendix: 7 pages · Bibliography: 3 pages · 11 figures · 7 tables