Abstract

The evolution of Advanced Driver Assistance Systems (ADAS) towards the ultimate goal of autonomous driving relies on a large number of sensors to perform a wide range of operations, from parking assistance to emergency braking and environment mapping for target recognition/classification. Low-cost Mass-Market Radars (MMRs) are now widely used for object detection at various ranges (up to 250 meters), but they might not be suited for high-precision environment mapping. In this context, vehicular Synthetic Aperture Radar (SAR) is emerging as a promising technique to augment radar imaging capability by exploiting the vehicle's motion to provide two-dimensional (2D), or even three-dimensional (3D), images of the surroundings. SAR achieves a higher resolution than standard automotive radars, provided that the motion is precisely known. In this regard, one of the most attractive solutions to increase the positioning accuracy is to fuse the information from multiple on-board sensors, such as Global Navigation Satellite System (GNSS) receivers, Inertial Measurement Units (IMUs), odometers and steering angle sensors. This paper proposes a multi-sensor fusion technique to support automotive SAR systems, experimentally validating the approach and demonstrating its advantages over standard navigation solutions. The results show that multi-sensor-aided SAR images the surroundings with centimeter-level accuracy over typical urban trajectories, confirming its potential for practical applications and leaving room for further improvements.
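
The abstract does not name the fusion algorithm, so the following is only an illustrative sketch of the idea: a textbook extended Kalman filter that propagates a simple bicycle model from odometer speed and steering angle and corrects it with GNSS position fixes, three of the sensor types listed above. The wheelbase L_WB and the noise covariances Q and R are hypothetical placeholder values; the paper's actual method may differ (e.g., by also integrating the IMU and estimating additional states).

```python
import numpy as np

# Minimal EKF sketch: fuse odometer speed + steering angle (bicycle-model
# prediction) with GNSS position fixes to estimate the vehicle trajectory.
# State: [x, y, heading]. All parameter values below are illustrative only.

L_WB = 2.7                               # assumed wheelbase [m] (hypothetical)
Q = np.diag([0.02, 0.02, 0.005]) ** 2    # process noise covariance (assumed)
R = np.diag([0.5, 0.5]) ** 2             # GNSS position noise covariance (assumed)
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])          # GNSS observes position only

def predict(x, P, v, delta, dt):
    """Propagate the state with odometer speed v and steering angle delta."""
    theta = x[2]
    x_new = x + dt * np.array([v * np.cos(theta),
                               v * np.sin(theta),
                               v / L_WB * np.tan(delta)])
    # Jacobian of the bicycle-model transition with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def update(x, P, z_gnss):
    """Correct the predicted state with a GNSS position fix z = [x, y]."""
    y = z_gnss - H @ x                    # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# Usage: straight drive at 10 m/s, noisy GNSS fix at every 0.1 s step.
x, P = np.zeros(3), np.eye(3)
rng = np.random.default_rng(0)
for k in range(100):
    x, P = predict(x, P, v=10.0, delta=0.0, dt=0.1)
    truth = np.array([10.0 * 0.1 * (k + 1), 0.0])
    x, P = update(x, P, truth + rng.normal(0.0, 0.5, 2))
print("estimated position:", x[:2])
```

The design point this illustrates is the one the abstract makes: GNSS alone is too noisy for SAR focusing, while dead reckoning from odometer and steering drifts; fusing them bounds the drift with absolute fixes, which is what enables the centimeter-level trajectory accuracy SAR requires.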
