Abstract

This article demonstrates a novel framework for estimating dense scene flow from depth camera data. The estimated flow vectors are used to identify obstacles, which improves the path planning module of an autonomous vehicle's (AV) intelligence. Path planning in cluttered environments has long been regarded as the primary difficulty in the development of AVs: these vehicles must be intelligent enough to recognize their surroundings and navigate around obstacles. To detect and avoid obstacles in a cluttered environment, the AV needs a thorough understanding of its surroundings, so it is preferable to know the kinematic behavior (position and direction) of the obstacles when determining the course. Accordingly, the position and direction of the obstacles are computed with a 3D vision sensor by comparing depth images across time frames. The current study focuses on extracting flow vectors in 3D coordinates with a differential scene flow method. In general, the evaluation of scene flow algorithms is crucial for determining their accuracy and effectiveness in different applications. The gradient vector field snake model, which extracts changes in pixel values in three directions, is combined with the scene flow technique to identify both static and dynamic obstacles. Our goal is to create a real-time obstacle avoidance method based on scene flow estimation from a single vision sensor. In addition, common evaluation metrics such as endpoint error (EPE), average angular error (AAE), and standard deviation of angular error (STDAE) are used to measure the accuracy of different algorithms in terms of computational error on the benchmark Middlebury datasets. The proposed technique is validated in several experiments using a Photonic Mixer Device (PMD) camera and a Kinect sensor as 3D sensors. Finally, the numerical and experimental results are reported.
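For reference, the EPE, AAE, and STDAE metrics mentioned above follow the standard optical flow evaluation conventions used with the Middlebury datasets. The sketch below shows one common way to compute them for a 2D flow field; the function name, array shapes, and NumPy implementation are illustrative assumptions rather than code from the article, which extends the evaluation to 3D scene flow.

```python
import numpy as np

def flow_error_metrics(flow_est, flow_gt):
    """Compute EPE, AAE, and STDAE between estimated and ground-truth flow.

    flow_est, flow_gt : arrays of shape (H, W, 2) holding (u, v) components.
    Returns (epe, aae_deg, stdae_deg), with angular errors in degrees.
    """
    # Endpoint error: mean Euclidean distance between flow vectors.
    epe = np.mean(np.linalg.norm(flow_est - flow_gt, axis=-1))

    # Angular error (Middlebury convention): angle between the space-time
    # vectors (u, v, 1); the appended 1 avoids degeneracies for zero flow.
    u_e, v_e = flow_est[..., 0], flow_est[..., 1]
    u_g, v_g = flow_gt[..., 0], flow_gt[..., 1]
    num = u_e * u_g + v_e * v_g + 1.0
    den = np.sqrt(u_e**2 + v_e**2 + 1.0) * np.sqrt(u_g**2 + v_g**2 + 1.0)
    ang = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))

    return epe, np.mean(ang), np.std(ang)

if __name__ == "__main__":
    # Toy example: random data standing in for a Middlebury-style flow field.
    rng = np.random.default_rng(0)
    gt = rng.normal(size=(10, 10, 2))
    est = gt + 0.1 * rng.normal(size=(10, 10, 2))
    print(flow_error_metrics(est, gt))
```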

