Abstract

Autonomous unmanned ground vehicles are a product of ongoing technological development. Providing them with multi-view, real-time environmental state has become a clear trend, and object detection algorithms are increasingly applied in autonomous driving. In recent years, multi-sensor driving assistance systems have emerged as a way to address these needs. In this paper, we introduce a driving assistance system for autonomous driving based on the fusion of data from multiple sensors. Six fisheye cameras and twelve ultrasonic radars collect the surround-view environmental state, while a low-light camera and a lidar provide the forward-view environmental state. We design a panoramic mosaic algorithm, a surround-view data fusion algorithm, and a forward-view data fusion algorithm, and an object detection algorithm provides forward-view detection results. This driving assistance system will advance research into autonomous unmanned ground vehicles.
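To make the described architecture concrete, the sketch below groups the sensors and processing paths named in the abstract into a minimal pipeline skeleton. All class and function names here are illustrative assumptions, not the paper's implementation or API; the bodies are placeholders showing only how the surround-view and forward-view paths are organized.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorSuite:
    """Sensor layout as enumerated in the abstract (field names are hypothetical)."""
    fisheye_cameras: int = 6      # surround-view imagery feeding the panoramic mosaic
    ultrasonic_radars: int = 12   # short-range obstacle distances around the vehicle
    low_light_cameras: int = 1    # forward-view imagery under poor illumination
    lidars: int = 1               # forward-view 3D point cloud

def surround_view_pipeline(fisheye_frames: List[object],
                           ultrasonic_ranges_m: List[float]) -> dict:
    """Surround-view path: stitch the fisheye frames into a panoramic mosaic,
    then fuse the ultrasonic distance readings onto it (placeholder logic)."""
    mosaic = {"type": "panoramic_mosaic", "num_frames": len(fisheye_frames)}
    return {"mosaic": mosaic, "obstacle_ranges_m": ultrasonic_ranges_m}

def forward_view_pipeline(low_light_frame: object,
                          lidar_points: List[Tuple[float, float, float]]) -> dict:
    """Forward-view path: fuse the low-light image with the lidar point cloud
    and run object detection on the fused view (placeholder logic)."""
    detections: List[dict] = []  # an object detector would populate this list
    return {"num_lidar_points": len(lidar_points), "detections": detections}

if __name__ == "__main__":
    suite = SensorSuite()
    print(surround_view_pipeline([None] * suite.fisheye_cameras,
                                 [1.2] * suite.ultrasonic_radars))
    print(forward_view_pipeline(None, [(0.0, 0.0, 0.0)]))
```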
