Abstract
This paper proposes a forward collision warning system for an autonomous vehicle based on a novel point-to-pixel multi-sensor data fusion algorithm that combines 2D LIDAR data with image pixel data from a stereo camera to detect, classify, and track obstacles in front of the vehicle in real time. The LIDAR and stereo camera were synchronized and calibrated, and the distance measurements obtained from the two sensors were combined using a Kalman filter performing multi-sensor data fusion in real time on an embedded platform. A region of interest (ROI) was selected from the camera image, and the fused distance data was overlaid on the object contour. The distance and angle of the target were obtained from the LIDAR, while target classification was performed by applying the MobileNet SSD deep learning algorithm to the camera data. The root mean squared error (RMSE) and mean absolute error (MAE) of the proposed fusion algorithm are 93.802 mm and 83.453 mm lower, respectively, than those of the individual distance measurements from the stereo camera and the 2D LIDAR sensor. The uncertainty and variance of the fused measurements were also reduced.
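The Kalman-filter fusion of the two distance streams can be illustrated with a minimal one-dimensional sketch. This is not the paper's actual implementation: the sequential two-measurement update, the static-target motion model, and the noise parameters (`var_lidar`, `var_stereo`, `q`) are all illustrative assumptions.

```python
import numpy as np

def kalman_fuse(z_lidar, z_stereo, var_lidar, var_stereo,
                x0=0.0, p0=1e6, q=0.01):
    """Fuse two noisy distance streams with a 1-D Kalman filter.

    z_lidar, z_stereo : per-frame distance measurements (mm)
    var_lidar, var_stereo : assumed measurement noise variances (mm^2)
    Returns the fused distance estimate for each frame.
    """
    x, p = x0, p0  # state estimate (mm) and its variance
    fused = []
    for zl, zs in zip(z_lidar, z_stereo):
        # Predict: static-target model, so only the variance grows
        p += q
        # Update with the LIDAR measurement
        k = p / (p + var_lidar)
        x += k * (zl - x)
        p *= (1.0 - k)
        # Update with the stereo-camera measurement
        k = p / (p + var_stereo)
        x += k * (zs - x)
        p *= (1.0 - k)
        fused.append(x)
    return np.array(fused)
```

Because each update weights the measurement by its noise variance, the fused estimate's error (and variance) falls below that of either sensor alone, which mirrors the RMSE/MAE reduction reported in the abstract.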