Abstract
Stereo vision is a straightforward approach to 3D information perception; however, image correspondence is the main difficulty in real-time applications. In this paper a qualitative 3D scene verification method is proposed. The global correspondence problem is simplified to an optimization of the disparity-map parameters, and the fitting results provide the camera pose parameters. Obstacles are then detected from abnormalities in the disparity. As is well known, simple two-dimensional vision-based road-following methods are not sufficient for Autonomous Land Vehicle (ALV) navigation in complex environments under arbitrary weather and illumination conditions: they cannot recognize 3D obstacles efficiently, and may detect shadows or water on the road as false obstacles. The novel qualitative stereo-vision method proposed in this paper is designed for real-time obstacle detection on structural roads. The optimal disparity map of the image pair can be computed easily, and it is a linear function on the image plane. The road scene is verified against this optimal disparity map, and a morphological procedure is applied for more reliable extraction of abnormal-disparity regions. These regions form the focus-of-attention areas for subsequent processing, such as obstacle avoidance or object recognition. The algorithm does not depend on pixel-wise correspondence, so it is very efficient and can be implemented in real time. Experiments show that this approach is simple and robust.
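The abstract leaves the fitting and verification steps implicit. The sketch below illustrates one way the idea could be realized: the road-plane disparity d(u, v) = a·u + b·v + c is fit directly to the image pair by minimizing a photometric residual (no per-pixel correspondence search), and pixels that still disagree after warping are treated as abnormal-disparity regions and cleaned up morphologically. The function names, the Nelder-Mead optimizer, the nearest-neighbour warp, and the thresholds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import binary_opening, label
from scipy.optimize import minimize


def planar_disparity(params, shape):
    """Disparity predicted by the road-plane model d(u, v) = a*u + b*v + c."""
    a, b, c = params
    v, u = np.mgrid[0:shape[0], 0:shape[1]]  # v: row index, u: column index
    return a * u + b * v + c


def warp_right(right, disparity):
    """Shift each right-image pixel by its predicted disparity (nearest-neighbour sampling)."""
    h, w = right.shape
    cols = np.clip(np.round(np.arange(w) - disparity).astype(int), 0, w - 1)
    rows = np.broadcast_to(np.arange(h)[:, None], (h, w))
    return right[rows, cols]


def fit_road_plane(left, right, init=(0.0, 0.1, 1.0)):
    """Optimize the three plane parameters directly instead of dense correspondence."""
    def cost(p):
        warped = warp_right(right, planar_disparity(p, left.shape))
        return np.mean(np.abs(left.astype(float) - warped.astype(float)))
    return minimize(cost, init, method="Nelder-Mead").x


def obstacle_regions(left, right, params, threshold=20.0):
    """Pixels that still disagree after warping violate the road model; clean by morphology."""
    warped = warp_right(right, planar_disparity(params, left.shape))
    mask = np.abs(left.astype(float) - warped.astype(float)) > threshold
    mask = binary_opening(mask, structure=np.ones((3, 3)))
    return label(mask)  # labelled abnormal-disparity regions and their count
```

Because only three parameters are estimated, the optimization is cheap compared with dense matching, which is consistent with the real-time claim; the morphological opening suppresses isolated false detections before region labelling.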