Vision-Guided Loco-Manipulation with a Snake Robot

24 March 2025
Adarsh Salagame
Sasank Potluri
Keshav Bharadwaj Vaidyanathan
Kruthika Gangaraju
Eric N. Sihite
Milad Ramezani
Alireza Ramezani
Abstract

This paper presents the development and integration of a vision-guided loco-manipulation pipeline for Northeastern University's snake robot, COBRA. The system leverages a YOLOv8-based object detection model and depth data from an onboard stereo camera to estimate the 6-DOF pose of target objects in real time. We introduce a framework for autonomous detection and control, enabling closed-loop loco-manipulation for transporting objects to specified goal locations. Additionally, we demonstrate open-loop experiments in which COBRA successfully performs real-time object detection and loco-manipulation tasks.
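The pipeline described above pairs YOLOv8 detections with stereo depth to recover object pose. As a minimal illustrative sketch of one step in such a pipeline, the snippet below back-projects the center of a 2D detection box into 3D camera coordinates using the standard pinhole model. This is not the authors' implementation: the function name, the intrinsic matrix values, and the example bounding box are placeholders, and a full 6-DOF estimate would additionally require orientation, which is not shown here.

```python
import numpy as np

def bbox_center_to_camera_xyz(bbox, depth_m, K):
    """Back-project the center of a 2D detection box into 3D camera
    coordinates with the pinhole model: x = (u - cx) * Z / fx, etc.

    bbox   : (x_min, y_min, x_max, y_max) in pixels
    depth_m: stereo depth at the box center, in meters
    K      : 3x3 camera intrinsic matrix
    """
    u = 0.5 * (bbox[0] + bbox[2])          # box center, horizontal pixel
    v = 0.5 * (bbox[1] + bbox[3])          # box center, vertical pixel
    fx, fy = K[0, 0], K[1, 1]              # focal lengths in pixels
    cx, cy = K[0, 2], K[1, 2]              # principal point
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Placeholder intrinsics for a 640x480 stereo camera (illustrative values only).
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

# A hypothetical detection box centered on the principal point at 1.5 m depth
# maps to a point directly on the optical axis.
p = bbox_center_to_camera_xyz((300, 220, 340, 260), 1.5, K)
# p -> [0.0, 0.0, 1.5]
```

In a closed-loop setup like the one the abstract describes, a position estimate of this kind would be refreshed every frame and fed to the locomotion controller as the current target.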

@article{salagame2025_2503.18308,
  title={Vision-Guided Loco-Manipulation with a Snake Robot},
  author={Adarsh Salagame and Sasank Potluri and Keshav Bharadwaj Vaidyanathan and Kruthika Gangaraju and Eric Sihite and Milad Ramezani and Alireza Ramezani},
  journal={arXiv preprint arXiv:2503.18308},
  year={2025}
}