Improving Multi-Vehicle Perception Fusion with Millimeter-Wave Radar Assistance

1 June 2025
Zhiqing Luo
Yi Wang
Yingying He
Wei Wang
arXiv (abs) · PDF · HTML
Main: 9 pages, 17 figures; Bibliography: 1 page
Abstract

Cooperative perception enables vehicles to share sensor readings and has become a new paradigm for improving driving safety; the key enabling technology for realizing this vision is aligning and fusing the perceptions accurately and in real time. Recent advances in view alignment rely on high-density LiDAR data or fine-grained image feature representations, which, however, fail to meet the accuracy, real-time, and adaptability requirements of autonomous driving. To this end, we present MMatch, a lightweight system that enables accurate and real-time perception fusion with mmWave radar point clouds. The key insight is that the fine-grained spatial information provided by the radar presents unique associations with all the vehicles, even across two separate views. By capturing and understanding the unique local and global positions of the targets in this association, we can quickly identify all the co-visible vehicles for view alignment. We implement MMatch on both datasets collected from the CARLA platform and real-world traffic, comprising over 15,000 radar point cloud pairs. Experimental results show that MMatch achieves decimeter-level accuracy within 59 ms, significantly improving reliability for autonomous driving.
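
The abstract does not spell out the MMatch algorithm itself, but the final step it describes, aligning two radar views once the co-visible vehicles have been matched, is a standard rigid-registration problem. The sketch below is a minimal illustration of that step only: it assumes the cross-view target correspondences are already given, works in 2D, and uses a closed-form Kabsch/Procrustes fit. Function names (align_views, fuse_point_clouds) are illustrative assumptions, not the authors' implementation.

import numpy as np

def align_views(src_targets: np.ndarray, dst_targets: np.ndarray):
    """Estimate the 2D rigid transform (R, t) mapping src_targets onto dst_targets.

    src_targets, dst_targets: (N, 2) arrays of matched co-visible target
    positions (e.g., vehicle centroids extracted from each radar view).
    Returns rotation R (2x2) and translation t (2,).
    """
    # Center both point sets on their centroids.
    src_c = src_targets.mean(axis=0)
    dst_c = dst_targets.mean(axis=0)
    src0 = src_targets - src_c
    dst0 = dst_targets - dst_c

    # Kabsch: SVD of the cross-covariance gives the optimal rotation.
    H = src0.T @ dst0
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def fuse_point_clouds(src_cloud: np.ndarray, dst_cloud: np.ndarray, R, t):
    """Transform the source radar point cloud into the destination frame and stack."""
    aligned = src_cloud @ R.T + t
    return np.vstack([dst_cloud, aligned])

With only a handful of matched targets, this closed-form fit runs in well under a millisecond, which is consistent with the lightweight, real-time goal the abstract states; the paper's contribution lies in finding those co-visible targets from the radar point clouds in the first place.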

@article{luo2025_2506.00837,
  title={Improving Multi-Vehicle Perception Fusion with Millimeter-Wave Radar Assistance},
  author={Zhiqing Luo and Yi Wang and Yingying He and Wei Wang},
  journal={arXiv preprint arXiv:2506.00837},
  year={2025}
}