Abstract
Road detection is one of the key challenges for autonomous vehicles. Two kinds of sensors are commonly used for road detection: cameras and LIDARs. However, each suffers from inherent drawbacks, so sensor fusion is commonly used to combine their merits. Nevertheless, current sensor fusion methods are dominated by either the camera or the LIDAR rather than making the best of both. In this paper, we extend the conditional random field (CRF) model and propose a novel hybrid CRF model to fuse information from the camera and the LIDAR. After aligning the LIDAR points with the image pixels, we take the labels (road or background) of both the pixels and the LIDAR points as random variables and infer the labels by minimizing a hybrid energy function. Boosted decision tree classifiers are learned to predict the unary potentials of both the pixels and the LIDAR points. The pairwise potentials in the hybrid model encode (i) contextual consistency in the image, (ii) contextual consistency in the point cloud, and (iii) cross-modal consistency between the aligned pixels and LIDAR points. The model thus integrates the information from the two sensors in a probabilistic way and makes full use of both. The hybrid CRF model can be optimized efficiently with graph cuts to obtain the road area. Extensive experiments on the KITTI-ROAD benchmark show that the proposed method outperforms existing methods.
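To make the inference step concrete, the sketch below shows how a hybrid energy of this form, unary terms for every pixel and LIDAR point plus Potts-style pairwise terms over the image graph, the point-cloud graph, and the cross-modal links, can be minimized exactly with a single s-t min cut. It is a minimal illustration using the open-source PyMaxflow library; the function name, array layouts, edge lists, and weights are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch of binary (road/background) CRF inference via graph cuts,
# assuming Potts pairwise potentials. Requires: pip install PyMaxflow
import numpy as np
import maxflow


def infer_labels(pixel_unary, point_unary,
                 image_edges, cloud_edges, cross_edges):
    """Minimize the hybrid energy with one s-t min cut.

    pixel_unary : (Np, 2) costs for labels 0 (background) and 1 (road),
                  e.g. negative log-scores from the boosted classifiers
    point_unary : (Nl, 2) analogous costs for the LIDAR points
    *_edges     : iterables of (i, j, w) with Potts weight w >= 0;
                  in cross_edges, i indexes a pixel and j a LIDAR point
    """
    n_pix, n_pts = len(pixel_unary), len(point_unary)
    g = maxflow.Graph[float]()
    nodes = g.add_nodes(n_pix + n_pts)  # one variable per pixel and per point

    # Unary terms become terminal links: ending on the source side means
    # label 0 (cost c0 = sink capacity), sink side means label 1 (cost c1).
    for k, (c0, c1) in enumerate(np.vstack([pixel_unary, point_unary])):
        g.add_tedge(nodes[k], c1, c0)

    # Potts pairwise terms become symmetric n-links; cross-modal edges
    # connect a pixel node to the node of its aligned LIDAR point.
    for i, j, w in image_edges:
        g.add_edge(nodes[i], nodes[j], w, w)
    for i, j, w in cloud_edges:
        g.add_edge(nodes[n_pix + i], nodes[n_pix + j], w, w)
    for i, j, w in cross_edges:
        g.add_edge(nodes[i], nodes[n_pix + j], w, w)

    g.maxflow()
    # get_segment: 0 = source side (background), 1 = sink side (road)
    labels = np.array([g.get_segment(nodes[k]) for k in range(n_pix + n_pts)])
    return labels[:n_pix], labels[n_pix:]
```

Because the Potts weights are non-negative the energy is submodular, so the min cut recovers the exact global optimum of this binary labeling problem rather than an approximation.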