Robotic-assisted procedures offer enhanced precision, but fully autonomous systems are limited by incomplete task knowledge, the difficulty of modeling unstructured environments, and poor generalisation, while fully manual teleoperated systems face challenges such as delay, stability issues, and reduced sensory information. To address these limitations, we developed an interactive control strategy that assists the human operator by predicting their motion plan at both high and low levels. At the high level, a Transformer-based real-time gesture classification model performs surgeme recognition and dynamically adapts to the operator's actions; at the low level, a Confidence-based Intention Assimilation Controller adjusts robot actions based on user intent within a shared control paradigm. The system is built around a robotic suturing task and is supported by sensors that capture the robot's kinematics and the task dynamics. Experiments with users of varying skill levels demonstrated the effectiveness of the proposed approach, showing statistically significant improvements in task completion time and user satisfaction compared to traditional teleoperation.
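To make the two-level idea concrete, the sketch below shows one plausible arrangement under assumptions not taken from the paper: a small Transformer encoder classifies surgemes from a sliding window of robot kinematics, and the maximum class probability is reused as a confidence that linearly blends the operator's command with an assistive command. The names (SurgemeClassifier, blend_commands), the 19-dimensional kinematic window, the class count, and the linear blending rule are all illustrative placeholders, not the paper's actual architecture or controller law.

import torch
import torch.nn as nn

class SurgemeClassifier(nn.Module):
    # Toy Transformer encoder over a sliding window of robot kinematics.
    # Input: (batch, T, kin_dim) kinematic window -> surgeme class logits.
    def __init__(self, kin_dim=19, d_model=64, n_heads=4, n_layers=2, n_surgemes=10):
        super().__init__()
        self.embed = nn.Linear(kin_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_surgemes)

    def forward(self, x):
        h = self.encoder(self.embed(x))   # (batch, T, d_model)
        return self.head(h.mean(dim=1))   # average-pool over time -> logits

def blend_commands(operator_cmd, assist_cmd, confidence):
    # Confidence-weighted shared control: defer to the assistive plan only
    # when the intent prediction is confident (assumed linear blend).
    return confidence * assist_cmd + (1.0 - confidence) * operator_cmd

# Example: classify the current kinematic window and blend velocity commands.
model = SurgemeClassifier()
window = torch.randn(1, 50, 19)                     # 50 timesteps of (hypothetical) kinematics
probs = torch.softmax(model(window), dim=-1)
conf, surgeme = probs.max(dim=-1)                   # prediction confidence and surgeme label
operator_cmd = torch.tensor([0.01, 0.00, -0.02])    # operator Cartesian velocity command
assist_cmd = torch.tensor([0.012, 0.001, -0.018])   # assistive command for the predicted surgeme
cmd = blend_commands(operator_cmd, assist_cmd, conf.item())

In this sketch the same scalar serves both levels: the classifier output selects the assistive behaviour, and its confidence scales how strongly that behaviour overrides the operator, so low-confidence predictions leave the teleoperator in full control.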
@article{hu2025_2504.20761,
  title   = {Confidence-based Intent Prediction for Teleoperation in Bimanual Robotic Suturing},
  author  = {Zhaoyang Jacopo Hu and Haozheng Xu and Sion Kim and Yanan Li and Ferdinando Rodriguez y Baena and Etienne Burdet},
  journal = {arXiv preprint arXiv:2504.20761},
  year    = {2025}
}