Abstract
Dynamic obstacle detection is important for environmental perception in self-driving cars. Camera-based instance segmentation is a major trend in obstacle detection. However, unexpected dynamic obstacles are difficult to detect because their classes are absent from the model's training labels. In this study, we combine road-scene understanding, optical-flow motion tracking, and low-cost online visual tracking to build a system for detecting unexpected dynamic obstacles. To monitor pixel movement, a mobile recurrent pairwise-decoding optical-flow deep neural network rapidly estimates the pixel flows between two consecutive frames. To filter background noise and retain only the active region of the road, a mobile DABNet segments the targets (only roads and vehicles) in the scene. To reduce the load on the GPU, a cluster-matching tracker uses multithreaded CPU resources to follow the candidate unexpected dynamic obstacles extracted from the road understanding and the optical flows, tracking each obstacle individually in subsequent frames. The real-time system partitions GPU and CPU workloads to maximize the performance of the target platform. To evaluate efficiency, a driver-view video dataset is recorded to assess real-world obstacles in urban road scenes; in addition, animal-crash videos collected from YouTube are used to evaluate unexpected and rarely labeled objects. Furthermore, a mobile robot platform is used to test the proposed system's obstacle avoidance in a complicated indoor scene.
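The abstract describes a pipeline that splits work between the GPU (optical flow and road/vehicle segmentation) and CPU threads (per-obstacle tracking). Below is a minimal Python sketch of that split; every function in it (optical_flow, segment_road_and_vehicles, propose_obstacles, track_obstacle) is a hypothetical stand-in, since the authors' actual models and interfaces are not given here.

# Hypothetical sketch of the GPU/CPU workload split described in the abstract.
# All function names, shapes, and thresholds are illustrative assumptions,
# not the authors' code.
import concurrent.futures
import numpy as np

def optical_flow(prev_frame, frame):
    """Stand-in for the recurrent pairwise-decoding optical-flow network (GPU stage)."""
    # Dummy flow: per-pixel (dx, dy) displacements, all zero here.
    return np.zeros((*frame.shape[:2], 2), dtype=np.float32)

def segment_road_and_vehicles(frame):
    """Stand-in for the mobile DABNet segmenter (GPU stage): 1 = road/vehicle, 0 = background."""
    return np.ones(frame.shape[:2], dtype=np.uint8)

def propose_obstacles(flow, mask):
    """Keep moving pixels that lie on the road mask (assumed thresholding logic)."""
    moving = (np.linalg.norm(flow, axis=-1) > 1.0) & (mask == 1)
    ys, xs = np.nonzero(moving)
    if xs.size == 0:
        return []
    # One bounding box over the moving cluster; a real system would cluster first.
    return [(int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))]

def track_obstacle(frame, box):
    """Stand-in for the low-cost cluster-matching tracker, run on a CPU thread."""
    return box  # a real tracker would re-locate the cluster in the new frame

def process_stream(frames, cpu_workers=4):
    prev = None
    tracked = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=cpu_workers) as pool:
        for frame in frames:
            if prev is not None:
                flow = optical_flow(prev, frame)          # GPU stage
                mask = segment_road_and_vehicles(frame)   # GPU stage
                tracked.extend(propose_obstacles(flow, mask))
            # Follow each candidate obstacle on CPU threads, one tracker per obstacle.
            tracked = list(pool.map(lambda b: track_obstacle(frame, b), tracked))
            prev = frame
    return tracked

if __name__ == "__main__":
    frames = [np.random.randint(0, 255, (240, 320, 3), np.uint8) for _ in range(5)]
    print(process_stream(frames))

The key design point illustrated is that the two heavy neural networks share the GPU sequentially per frame, while the many lightweight trackers run concurrently on CPU threads, which matches the abstract's stated goal of balancing the platform's resources.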