Abstract

Owing to their sensing mechanisms, lidar and visual data are heavily degraded in adverse weather conditions, which poses potential safety hazards for vehicle navigation. Radar sensing is therefore desirable for building a more robust navigation system. In this paper, a cross‐modality method for radar localisation on prior lidar maps is presented. Specifically, the proposed workflow consists of two parts: first, bird's‐eye‐view radar images are translated into fake lidar images by a generative adversarial network trained offline. Then, with online radar scans, a Monte Carlo localisation framework is built to track the robot pose on lidar maps. The whole online localisation system only needs a rotating radar sensor and a pre‐built global lidar map. In the experimental section, the authors conduct an ablation study on image settings and test the proposed system on the Oxford Radar RobotCar Dataset. The promising results show that the proposed system can track the robot pose successfully, demonstrating the feasibility of radar style transfer for metric robot localisation on lidar maps.
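As a rough illustration of the online stage described above, the following is a minimal Monte Carlo localisation sketch in Python/NumPy. The generator stub (radar_to_fake_lidar) and the measurement model (scan_map_likelihood) are hypothetical placeholders standing in for the paper's trained GAN and its map-matching likelihood; only the generic predict–update–resample structure of a particle filter is taken from the abstract.

```python
import numpy as np

# --- Hypothetical components (placeholders, not the paper's code) ----------

def radar_to_fake_lidar(radar_bev):
    """Stand-in for the offline-trained GAN generator that maps a
    bird's-eye-view radar image to a fake lidar image."""
    # In practice this would be a forward pass through the trained generator.
    return radar_bev  # placeholder: identity mapping

def scan_map_likelihood(fake_lidar, lidar_map, pose):
    """Hypothetical measurement model: score how well the fake lidar image
    agrees with the prior lidar map when rendered at pose = (x, y, yaw)."""
    # A real implementation would project the map into the sensor frame at
    # `pose` and compare it with `fake_lidar` (e.g. an image correlation cost).
    x, y, _yaw = pose
    return np.exp(-0.01 * (x ** 2 + y ** 2))  # dummy score for illustration

# --- Monte Carlo localisation loop ------------------------------------------

rng = np.random.default_rng(0)
num_particles = 500
particles = rng.normal(0.0, 1.0, size=(num_particles, 3))  # (x, y, yaw)
weights = np.full(num_particles, 1.0 / num_particles)

def mcl_step(radar_bev, lidar_map, odometry):
    """One predict-update-resample cycle of the particle filter."""
    global particles, weights
    # 1. Predict: propagate particles with odometry plus motion noise.
    noise = rng.normal(0.0, [0.1, 0.1, 0.02], size=particles.shape)
    particles += odometry + noise
    # 2. Update: weight particles by fake-lidar / lidar-map agreement.
    fake_lidar = radar_to_fake_lidar(radar_bev)
    weights = np.array([scan_map_likelihood(fake_lidar, lidar_map, p)
                        for p in particles])
    weights /= weights.sum()
    # 3. Resample: systematic resampling concentrates particles on likely poses.
    positions = (np.arange(num_particles) + rng.random()) / num_particles
    indices = np.searchsorted(np.cumsum(weights), positions)
    indices = np.minimum(indices, num_particles - 1)
    particles = particles[indices]
    weights = np.full(num_particles, 1.0 / num_particles)
    # Pose estimate: mean of the (resampled, equally weighted) particle set.
    return particles.mean(axis=0)
```

In this sketch the only cross-modality element is the call to the generator before the weight update; everything downstream is a standard particle filter, which matches the abstract's claim that the online system needs only the rotating radar and the pre-built global lidar map.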
