The NavINST Dataset for Multi-Sensor Autonomous Navigation

20 February 2025
Paulo Ricardo Marques de Araujo
Eslam Mounier
Qamar Bader
Emma Dawson
Shaza I. Kaoud Abdelaziz
Ahmed Zekry
Mohamed Elhabiby
Aboelmagd Noureldin
Abstract

The NavINST Laboratory has developed a comprehensive multisensory dataset from various road-test trajectories in urban environments, featuring diverse lighting conditions, including indoor garage scenarios with dense 3D maps. The dataset includes multiple commercial-grade IMUs and a high-end tactical-grade IMU. It also contains a wide array of perception sensors: a solid-state LiDAR (making it one of the first datasets to do so), a mechanical LiDAR, four electronically scanning RADARs, a monocular camera, and two stereo cameras. In addition, it provides forward speed measurements derived from the vehicle's odometer, along with accurately post-processed high-end GNSS/IMU data that supply precise ground truth positioning and navigation information. The NavINST dataset is designed to support advanced research in high-precision positioning, navigation, mapping, computer vision, and multisensory fusion. It offers rich, multi-sensor data ideal for developing and validating robust algorithms for autonomous vehicles. Finally, it is fully integrated with ROS, ensuring ease of use and accessibility for the research community. The complete dataset and development tools are available at this https URL.
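
Since the dataset is distributed as ROS recordings, one quick way to explore it is the standard rosbag Python API. The snippet below is a minimal sketch, assuming a ROS 1 bag file; the file name and topic names are hypothetical, and the actual topics are documented with the dataset's development tools.

```python
# Minimal sketch for browsing a NavINST recording with the ROS 1 rosbag API.
# Assumptions: "navinst_trajectory.bag" and the topic names are placeholders,
# not the dataset's actual file/topic names.
import rosbag

bag = rosbag.Bag("navinst_trajectory.bag")
for topic, msg, t in bag.read_messages(topics=["/imu/data", "/lidar/points"]):
    # Each message carries its sensor payload plus the recording timestamp,
    # which can be used to time-align the different sensor streams.
    print(topic, t.to_sec())
bag.close()
```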

@article{araujo2025_2502.13863,
  title={The NavINST Dataset for Multi-Sensor Autonomous Navigation},
  author={Paulo Ricardo Marques de Araujo and Eslam Mounier and Qamar Bader and Emma Dawson and Shaza I. Kaoud Abdelaziz and Ahmed Zekry and Mohamed Elhabiby and Aboelmagd Noureldin},
  journal={arXiv preprint arXiv:2502.13863},
  year={2025}
}