To address the challenge of autonomous UGV localization in GNSS-denied off-road environments, this study proposes a matching-based localization method that leverages BEV perception images and satellite maps within a road similarity space to achieve high-precision localization. We first implement a robust LiDAR-inertial odometry system, followed by the fusion of LiDAR and image data to generate a local BEV perception image of the UGV. This approach mitigates the significant viewpoint discrepancy between ground-view images and satellite maps. The BEV image and satellite map are then projected into the road similarity space, where normalized cross correlation (NCC) is computed to assess the matching degree. Finally, a particle filter is employed to estimate the probability distribution of the vehicle's pose. When compared with GNSS ground truth, our localization system demonstrated stability without divergence over a long-distance test of 10 km, achieving an average lateral error of only 0.89 m and an average planar Euclidean error of 3.41 m. Furthermore, it maintained accurate and stable global localization even under nighttime conditions, further validating its robustness and adaptability.
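The abstract only outlines the matching pipeline, so the following is a minimal sketch of how NCC scoring in a road similarity space could drive a particle filter update. It assumes the BEV perception image and the satellite map have already been converted to single-channel road-similarity rasters at the same resolution; the function names, motion-noise values, and resampling threshold are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def ncc(a, b, eps=1e-8):
    """Normalized cross correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + eps
    return float((a * b).sum() / denom)

def crop_patch(sat_road, x, y, theta, size):
    """Crop a size x size patch from the satellite road-similarity raster,
    centred at pixel (x, y) and rotated by heading theta (nearest-pixel sampling)."""
    half = size // 2
    ys, xs = np.mgrid[0:size, 0:size] - half
    c, s = np.cos(theta), np.sin(theta)
    u = np.clip((c * xs - s * ys + x).round().astype(int), 0, sat_road.shape[1] - 1)
    v = np.clip((s * xs + c * ys + y).round().astype(int), 0, sat_road.shape[0] - 1)
    return sat_road[v, u]

def particle_filter_step(particles, weights, odom_delta, bev_road, sat_road,
                         motion_noise=(0.5, 0.5, 0.02)):
    """One predict/update cycle: propagate particles with the odometry increment,
    then reweight each particle by the NCC between the BEV road-similarity image
    and the satellite patch cropped at that particle's pose."""
    dx, dy, dth = odom_delta
    n = len(particles)
    noise = np.random.randn(n, 3) * motion_noise
    # Predict: apply the odometry increment in each particle's frame plus Gaussian noise.
    c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += c * dx - s * dy + noise[:, 0]
    particles[:, 1] += s * dx + c * dy + noise[:, 1]
    particles[:, 2] += dth + noise[:, 2]
    # Update: use the NCC score as a (clipped) measurement likelihood.
    size = bev_road.shape[0]
    scores = np.array([
        ncc(bev_road, crop_patch(sat_road, px, py, pth, size))
        for px, py, pth in particles
    ])
    weights *= np.maximum(scores, 0.0) + 1e-6
    weights /= weights.sum()
    # Resample when the effective sample size drops below half the particle count.
    if 1.0 / (weights ** 2).sum() < 0.5 * n:
        idx = np.random.choice(n, n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```

The weighted mean of the surviving particles would then serve as the global pose estimate that the paper compares against GNSS ground truth; how the road-similarity rasters themselves are produced is not specified in the abstract and is left abstract here.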
@article{sun2025_2504.16346,
  title={Road Similarity-Based BEV-Satellite Image Matching for UGV Localization},
  author={Zhenping Sun and Chuang Yang and Yafeng Bu and Bokai Liu and Jun Zeng and Xiaohui Li},
  journal={arXiv preprint arXiv:2504.16346},
  year={2025}
}