Abstract
Detecting obstacles accurately is vital for realizing autonomous driving. However, a single sensor, such as an RGB camera or a LiDAR alone, struggles to meet the demands of autonomous driving when environmental sensing data are sparse, while equipment that can collect dense data in large quantities is expensive and hard to deploy in practical applications. The challenge, then, is to obtain high-precision three-dimensional obstacle information in real time from sparse data while keeping application costs low. To address this problem, this article proposes a real-time obstacle detection technique that fuses LiDAR data with image information; it detects vehicles close to the ART in sparse data in real time. The sparse point cloud is first de-noised, and the ground is separated from the initial cloud. The remaining cloud, which includes surrounding vehicles, is clustered according to the structural relationships in the data and the scanning mechanism of the LiDAR. The processed LiDAR clusters are then fused with classification information generated from images by a neural network to locate suspicious obstacles: the ratio between each vehicle's horizontal center and the image width is computed in the image coordinate system, and the category information and ratio values are mapped onto the LiDAR clusters during ART operation. The method is tested and verified in the field. Evaluation results show that the proposed method achieves an average accuracy of 85.5% for obstacle classification with a cost-effective sensing suite.
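The ratio-based fusion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the bounding-box format, and the assumed camera horizontal field of view (`h_fov_deg`) are all hypothetical, and a real system would use the calibrated camera–LiDAR extrinsics rather than a flat FOV mapping.

```python
import math

def detection_ratio(bbox, image_width):
    """Horizontal center of a 2-D detection as a fraction of image width."""
    x_min, _, x_max, _ = bbox
    return (x_min + x_max) / 2.0 / image_width

def cluster_ratio(centroid, h_fov_deg=90.0):
    """Map a LiDAR cluster centroid (x forward, y left, meters) onto the
    same 0..1 horizontal scale, assuming a camera with the given
    horizontal FOV aligned with the LiDAR's forward axis (assumption)."""
    x, y = centroid
    azimuth = math.degrees(math.atan2(-y, x))  # positive to the right
    return 0.5 + azimuth / h_fov_deg

def label_clusters(clusters, detections, image_width, tol=0.1):
    """Assign each cluster the class of the image detection whose ratio
    is closest, or None if no detection is within the tolerance."""
    labels = []
    for centroid in clusters:
        r = cluster_ratio(centroid)
        best = min(detections,
                   key=lambda d: abs(detection_ratio(d["bbox"], image_width) - r))
        gap = abs(detection_ratio(best["bbox"], image_width) - r)
        labels.append(best["cls"] if gap <= tol else None)
    return labels
```

For example, with a 640-pixel-wide image, a cluster straight ahead at `(10.0, 0.0)` maps to ratio 0.5 and would pick up the class of a detection centered at pixel 320.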
More From: IEEE Transactions on Intelligent Transportation Systems