Abstract

This paper presents a method for estimating the position and velocity of a moving obstacle from a moving vehicle. In most stereo vision systems, an obstacle's position is computed by triangulation or by an inverse perspective mapping (IPM) approach; however, measurement errors grow at long range because of quantization and matching errors. The key to reducing measurement error is to estimate disparity both accurately and precisely. This article focuses on improving precision, because accuracy can be enhanced through calibration in most measurement systems. The proposed method has two steps: estimating sub-pixel disparities with a stripe-based accurate disparity (S-BAD) method, and estimating and tracking the obstacle's position and velocity with an IPM-based extended Kalman filter (EKF). The S-BAD method estimates accurate sub-pixel disparities by applying stripe-based zero-mean normalized cross-correlation (ZNCC) to the vertical edge features within the dominant maximum-disparity region, which corresponds to the points nearest the host vehicle. This minimizes quantization error and matching ambiguity and thereby improves the precision of the obstacle's disparity. The IPM-based EKF minimizes the error covariance and estimates the relative velocity while recursively predicting and updating the obstacle's state; the measurement model's stable error covariance also yields optimal filtering performance. The experimental results show that the S-BAD method improves the precision of distance estimation and that the IPM-based EKF minimizes the velocity error covariance in real road environments.
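The abstract does not give implementation details, but the core idea behind sub-pixel disparity estimation with ZNCC can be sketched as follows. This is a minimal illustration, not the paper's S-BAD algorithm: the patch size, the disparity search range, the parabolic refinement step, and the function names (`zncc`, `subpixel_disparity`) are all assumptions introduced here.

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def subpixel_disparity(left, right, x, y, half, max_disp):
    """Integer ZNCC search followed by parabolic sub-pixel refinement.

    The parabola fit over the three scores around the integer maximum is
    what reduces the quantization error of the integer disparity grid.
    """
    ref = left[y - half:y + half + 1, x - half:x + half + 1]
    scores = []
    for d in range(max_disp + 1):
        xr = x - d                      # candidate column in the right image
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1]
        scores.append(zncc(ref, cand))
    scores = np.array(scores)
    d0 = int(scores.argmax())           # best integer disparity
    if 0 < d0 < len(scores) - 1:
        s_m, s_0, s_p = scores[d0 - 1], scores[d0], scores[d0 + 1]
        denom = s_m - 2 * s_0 + s_p
        if denom != 0:                  # vertex of the fitted parabola
            return d0 + 0.5 * (s_m - s_p) / denom
    return float(d0)
```

The parabolic vertex offset is bounded by half a pixel, so the refined disparity always stays within the winning integer bin; S-BAD additionally restricts the match to stripe-shaped regions of vertical edges, which this sketch does not model.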
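Likewise, the recursive predict/update cycle of an EKF driven by a disparity measurement can be sketched in a simplified form. The 1-D constant-velocity state, the camera constants `F_PX` and `BASE`, and the noise levels below are illustrative assumptions; the paper's IPM-based measurement model is not specified in the abstract.

```python
import numpy as np

F_PX = 800.0   # focal length in pixels (assumed)
BASE = 0.25    # stereo baseline in metres (assumed)

def ekf_step(x, P, d_meas, dt, q=0.5, r=0.1):
    """One predict/update cycle of a 1-D constant-velocity EKF.

    State x = [range X (m), relative velocity V (m/s)]; the measurement
    is the disparity d = F_PX * BASE / X (pixels), which is nonlinear in
    the state, hence the extended (linearized) update.
    """
    # Predict with the constant-velocity motion model.
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the nonlinear disparity measurement.
    h = F_PX * BASE / x[0]                        # predicted disparity
    H = np.array([[-F_PX * BASE / x[0]**2, 0.0]]) # measurement Jacobian
    S = H @ P @ H.T + r                           # innovation covariance
    K = P @ H.T / S                               # Kalman gain (2x1)
    x = x + (K * (d_meas - h)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Even though only range is measured, the cross terms of the covariance let the filter infer the relative velocity from successive updates, which is the role the abstract assigns to the IPM-based EKF.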
