Advances in Compliance Detection: Novel Models Using Vision-Based Tactile Sensors

Compliance is a critical parameter for describing objects in engineering, agriculture, and biomedical applications. Traditional compliance detection methods lack portability and scalability, rely on specialized and often expensive equipment, and are unsuitable for robotic applications. Moreover, existing neural network-based approaches using vision-based tactile sensors still suffer from insufficient prediction accuracy. In this paper, we propose two models, based on Long-term Recurrent Convolutional Networks (LRCNs) and Transformer architectures, that leverage RGB tactile images and other information captured by the vision-based sensor GelSight to accurately predict compliance metrics. We validate the performance of these models using multiple metrics and demonstrate their effectiveness in accurately estimating compliance. The proposed models exhibit significant performance improvements over the baseline. Additionally, we investigate the correlation between sensor compliance and object compliance estimation, finding that objects harder than the sensor are more challenging to estimate.
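The LRCN pipeline described above can be illustrated with a minimal sketch: a per-frame feature extractor followed by a recurrent aggregation over the image sequence, ending in a scalar regression head for the compliance estimate. This is an illustrative toy in NumPy, not the authors' architecture; the global-average-pool feature extractor and the simple RNN update stand in for the CNN and LSTM components, and all weight matrices here are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def frame_features(frame, W_feat):
    """Per-frame feature extractor (stand-in for the CNN part of an LRCN).
    frame: (H, W, 3) RGB tactile image; returns a (D,) feature vector."""
    pooled = frame.mean(axis=(0, 1))      # (3,) global average pool over pixels
    return np.tanh(W_feat @ pooled)       # (D,) nonlinear projection

def lrcn_predict(frames, W_feat, W_h, W_x, w_out):
    """Recurrent aggregation over time (stand-in for the LSTM part).
    frames: (T, H, W, 3) sequence; returns a scalar compliance estimate."""
    h = np.zeros(W_h.shape[0])
    for frame in frames:
        x = frame_features(frame, W_feat)
        h = np.tanh(W_h @ h + W_x @ x)    # simple RNN state update
    return float(w_out @ h)               # scalar regression head

# toy example: an 8-frame sequence of 32x32 RGB tactile images
T, H, W, D = 8, 32, 32, 16
frames = rng.random((T, H, W, 3))
W_feat = rng.standard_normal((D, 3))
W_h = rng.standard_normal((D, D)) * 0.1
W_x = rng.standard_normal((D, D)) * 0.1
w_out = rng.standard_normal(D)
print(lrcn_predict(frames, W_feat, W_h, W_x, w_out))
```

The same sequence-to-scalar structure applies to the Transformer variant, with self-attention over the per-frame features replacing the recurrent update.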
@article{li2025_2506.14980,
  title   = {Advances in Compliance Detection: Novel Models Using Vision-Based Tactile Sensors},
  author  = {Ziteng Li and Malte Kuhlmann and Ilana Nisky and Nicolás Navarro-Guerrero},
  journal = {arXiv preprint arXiv:2506.14980},
  year    = {2025}
}