Abstract

Advanced driver assistance systems (ADAS) improve driving safety and comfort by using onboard sensors to collect environmental data, analyze it, and support decision making. Consequently, ADAS place high demands on distance perception of the environment. Perceptual sensors commonly used in traditional solutions include stereo vision sensors and Light Detection and Ranging (LiDAR) sensors. This paper proposes a multi-sensor fusion method for disparity estimation that combines the high data density of stereo vision sensors with the high measurement accuracy of LiDAR sensors. The method improves sensing accuracy while preserving high-density perception, making it suitable for distance-sensing tasks in complex environments. Experimental results on real data demonstrate that the proposed disparity estimation method performs well and is robust across different scenarios.
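As context for the fusion idea described above, a minimal sketch of one common way to combine the two modalities is shown below: sparse but accurate LiDAR depth measurements are converted to the stereo camera's disparity domain via the standard relation d = f * B / Z and then substituted into the dense stereo disparity map at the pixels where LiDAR returns exist. This is an illustrative assumption, not the paper's actual method; all function names and parameters here are hypothetical.

```python
import numpy as np

def lidar_depth_to_disparity(depth_m, focal_px, baseline_m):
    """Convert metric depth Z to stereo disparity using d = f * B / Z.

    depth_m    : LiDAR depth in meters (scalar or array)
    focal_px   : camera focal length in pixels
    baseline_m : stereo baseline in meters
    """
    return focal_px * baseline_m / depth_m

def fuse_disparity(stereo_disp, lidar_disp, lidar_mask):
    """Replace dense stereo disparities with LiDAR-derived values where available.

    stereo_disp : dense (H, W) disparity map from stereo matching
    lidar_disp  : sparse (H, W) disparity map from projected LiDAR points
    lidar_mask  : boolean (H, W) mask, True where a LiDAR return was projected
    """
    fused = stereo_disp.copy()
    fused[lidar_mask] = lidar_disp[lidar_mask]
    return fused
```

In practice, published fusion methods typically go beyond this hard substitution (e.g., propagating the sparse accurate measurements into neighboring pixels), but the sketch captures the density-versus-accuracy trade-off the abstract refers to.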
