Abstract

Vision, radar, and LiDAR sensors are widely used in autonomous vehicle perception, and object detection and classification in particular rely primarily on vision sensors. However, under poor lighting, dazzling sunlight, or bad weather, objects can be difficult to identify with conventional vision sensors. In this paper, we propose a sensor fusion system combining a thermal infrared camera and a LiDAR sensor that can reliably detect and identify objects even in poor-visibility environments such as severe glare, fog, or smoke. The proposed method first obtains the intrinsic parameters by calibrating the thermal infrared camera and the LiDAR sensor. An extrinsic calibration algorithm between the two sensors then estimates the extrinsic parameters (rotation and translation matrices) using 3D calibration targets. The system and proposed algorithm are shown to reliably detect and identify objects both under severe glare from direct sunlight or headlights and in low-visibility conditions such as dense fog or smoke.
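
As a rough illustration of the extrinsic calibration step described above, the sketch below estimates a rotation and translation from the LiDAR frame to the thermal camera frame by fitting a Perspective-n-Point model to corresponding target points. This is one common approach, not necessarily the paper's exact algorithm; the function and variable names (estimate_extrinsics, lidar_pts_3d, thermal_pts_2d) are illustrative assumptions, and the camera intrinsics K are assumed to come from the intrinsic calibration step.

# Minimal sketch, assuming 3D calibration-target points in the LiDAR frame and
# their 2D detections in the thermal image are already matched up.
# Names here are hypothetical, not taken from the paper.
import numpy as np
import cv2

def estimate_extrinsics(lidar_pts_3d, thermal_pts_2d, K, dist_coeffs):
    """Estimate rotation R and translation t mapping LiDAR points into the
    thermal camera frame via a Perspective-n-Point fit."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(lidar_pts_3d, dtype=np.float64),   # Nx3 target points (LiDAR frame)
        np.asarray(thermal_pts_2d, dtype=np.float64), # Nx2 pixel locations (thermal image)
        K, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R, _ = cv2.Rodrigues(rvec)  # convert rotation vector to 3x3 rotation matrix
    return R, tvec              # extrinsic parameters (rotation, translation)

# Usage (placeholder values; real inputs come from the 3D calibration targets):
# K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
# R, t = estimate_extrinsics(lidar_pts_3d, thermal_pts_2d, K, np.zeros(5))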
