3D Vision-tactile Reconstruction from Infrared and Visible Images for Robotic Fine-grained Tactile Perception

Yuankai Lin
Xiaofan Lu
Jiahui Chen
Hua Yang
Main: 5 pages · 8 figures · 3 tables · Bibliography: 2 pages
Abstract

To achieve human-like haptic perception in anthropomorphic grippers, the compliant sensing surfaces of vision-based tactile sensors (VTSs) must evolve from conventional planar configurations to biomimetically curved topographies with continuous surface gradients. However, planar VTSs face challenges when extended to curved surfaces, including insufficient illumination of the surface, blurring in reconstruction, and complex spatial boundary conditions for surface structures. With the end goal of constructing a human-like fingertip, our research (i) develops GelSplitter3D, which expands the imaging channels with a prism and a near-infrared (NIR) camera, (ii) proposes a photometric stereo neural network with a CAD-based normal ground-truth generation method to calibrate tactile geometry, and (iii) devises a normal integration method that uses depth-prior boundary constraints to correct the cumulative error of surface integrals. We demonstrate improved tactile sensing performance, a 40% gain in normal estimation accuracy, and the benefits of the sensor's shape in grasping and manipulation tasks.
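The third contribution, normal integration under depth-prior boundary constraints, can be viewed as a weighted least-squares problem: the depth field is fit to the gradients implied by the normal map, while known boundary depths anchor the solution and prevent the integration error from accumulating. The following is a minimal sketch of that idea, not the authors' implementation; the function name, the dense `lstsq` solve, and the boundary weight are illustrative assumptions.

```python
import numpy as np

def integrate_normals(normals, boundary_depth):
    """Least-squares normal integration on a small H x W grid.

    normals: (H, W, 3) unit normal map.
    boundary_depth: (H, W) depth prior, known values on (parts of) the
    border and NaN elsewhere. Hypothetical helper for illustration only.
    """
    H, W, _ = normals.shape
    # Target gradients implied by the normals: dz/dx = -nx/nz, dz/dy = -ny/nz
    p = -normals[..., 0] / normals[..., 2]
    q = -normals[..., 1] / normals[..., 2]

    idx = lambda y, x: y * W + x
    rows, cols, vals, rhs = [], [], [], []
    r = 0
    # Forward-difference equations: z[y, x+1] - z[y, x] = p[y, x]
    for y in range(H):
        for x in range(W - 1):
            rows += [r, r]; cols += [idx(y, x + 1), idx(y, x)]
            vals += [1.0, -1.0]; rhs.append(p[y, x]); r += 1
    # Forward-difference equations: z[y+1, x] - z[y, x] = q[y, x]
    for y in range(H - 1):
        for x in range(W):
            rows += [r, r]; cols += [idx(y + 1, x), idx(y, x)]
            vals += [1.0, -1.0]; rhs.append(q[y, x]); r += 1
    # Depth-prior boundary constraints, weighted strongly so they pin
    # the otherwise gauge-free (constant-offset) solution.
    w = 100.0
    for y in range(H):
        for x in range(W):
            if not np.isnan(boundary_depth[y, x]):
                rows.append(r); cols.append(idx(y, x))
                vals.append(w); rhs.append(w * boundary_depth[y, x]); r += 1

    A = np.zeros((r, H * W))
    A[rows, cols] = vals
    z, *_ = np.linalg.lstsq(A, np.asarray(rhs), rcond=None)
    return z.reshape(H, W)
```

On a planar test surface with exact normals and a correct border prior, this recovers the depth field up to solver tolerance; in practice the paper's learned normals are noisy, which is precisely why the boundary constraints matter.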

@article{lin2025_2506.15087,
  title={3D Vision-tactile Reconstruction from Infrared and Visible Images for Robotic Fine-grained Tactile Perception},
  author={Yuankai Lin and Xiaofan Lu and Jiahui Chen and Hua Yang},
  journal={arXiv preprint arXiv:2506.15087},
  year={2025}
}