Abstract

Autonomous driving relies on high-resolution, high-performance vision sensing. Lidar and stereo cameras, the two most widely used sensors in the industry, play central roles in perception, detection, control, and planning. Lidar currently provides the more accurate depth information, but its cost rises sharply with resolution, which prevents large-scale market deployment; conversely, camera-based depth estimation offers high resolution but comparatively low depth accuracy. To address the practical shortcomings of both sensors, this paper proposes a sensor fusion method for a stereo camera and a low-resolution Lidar that achieves high resolution, high performance, and low cost. The method comprises a new sensor design, multi-sensor calibration, classification and selection of Lidar point-cloud features, large-scale and efficient stereo matching and depth-map computation, and filling of missing depth information based on point-cloud segmentation. To verify the effectiveness of the method, a high-resolution Lidar was used as ground truth for comparison. The results show that the fusion method improves depth accuracy by 30% on average within a range of 30 meters while achieving 98% resolution. In addition, the paper presents a scheme for visualizing multi-sensor image fusion and packages five modules (multi-sensor calibration, large-scale stereo depth computation, low-resolution Lidar simulation, sensor data fusion, and fused-image and error visualization) to facilitate future secondary development.
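The pipeline the abstract describes (calibrate the two sensors, compute a dense stereo depth map, project the sparse Lidar returns into the camera image, and fuse the two) can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the authors' implementation: the function names, the pinhole intrinsic and 4x4 extrinsic calibration model, and the simple per-pixel overwrite fusion policy stand in for the paper's point-cloud feature classification and segmentation-based hole filling.

```python
# Minimal sketch of stereo/Lidar depth fusion. All names, matrices, and the
# fusion policy are illustrative assumptions, not the paper's exact method.
import numpy as np

def stereo_depth_from_disparity(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to metric depth via Z = f * B / d."""
    depth = np.zeros_like(disparity, dtype=np.float64)
    valid = disparity > 0                      # avoid division by zero
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

def project_lidar_to_image(points_xyz, T_cam_lidar, K, image_shape):
    """Project Lidar points (N, 3) into the camera image.

    T_cam_lidar: 4x4 extrinsic transform (Lidar frame -> camera frame),
                 obtained from multi-sensor calibration.
    K:           3x3 camera intrinsic matrix.
    Returns integer pixel coordinates (u, v) and per-point camera-frame depth.
    """
    n = points_xyz.shape[0]
    homo = np.hstack([points_xyz, np.ones((n, 1))])      # (N, 4) homogeneous
    cam = (T_cam_lidar @ homo.T).T[:, :3]                # points in camera frame
    cam = cam[cam[:, 2] > 0]                             # keep points in front
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                          # perspective divide
    h, w = image_shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & \
             (uv[:, 1] >= 0) & (uv[:, 1] < h)            # clip to image bounds
    return uv[inside].astype(int), cam[inside, 2]

def fuse_depth(stereo_depth, lidar_uv, lidar_depth):
    """Overwrite dense-but-noisy stereo depth with sparse-but-accurate Lidar
    depth wherever a Lidar return projects; keep stereo values elsewhere."""
    fused = stereo_depth.copy()
    u, v = lidar_uv[:, 0], lidar_uv[:, 1]
    fused[v, u] = lidar_depth                            # row = v, column = u
    return fused
```

The paper goes further than this sketch: it classifies and selects point-cloud features and propagates corrections within point-cloud segments, whereas the per-pixel overwrite above only shows the basic projection-and-replace step of the fusion.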
