Abstract
In vision-aided navigation, images of natural scenes are processed by a chain of complex algorithms, from feature extraction and feature matching to frame-to-frame tracking, to produce vision measurements for navigation aiding. Such vision measurements are subject to non-Gaussian errors and, in particular, outliers, which, if not properly accounted for, lead to inconsistent and usually optimistic estimation. The Assured Vision-Aided Inertial Localization (AVAIL) mechanization was recently set forth; it augments inertial navigation with probabilistic data association filtering (PDAF), which adaptively computes the probability that an outlier has gone undetected and weights vision measurements accordingly. Under a rather general assumption about the outlier distribution, the AVAIL mechanization has been shown to be consistent in the presence of real-world image processing errors. In this paper, we study feature matching errors and, in particular, the circumstances in which outliers occur. We do this by checking the extracted features and detected outliers against the original images in real-world environments to verify spatial and temporal assumptions about the feature errors and their distributions. The study provides evidence that lends strong support to the AVAIL mechanization.
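To make the weighting idea concrete, the sketch below shows a single textbook-style PDAF update step in which each validated vision measurement is weighted by the probability that it is not an outlier. This is only an illustrative sketch of the general PDAF technique, not the AVAIL implementation; the function name `pdaf_update` and the parameters `P_D` (detection probability), `P_G` (gate probability), and `lam` (assumed outlier/clutter density) are assumptions introduced here for illustration, and at least one validated measurement is assumed.

```python
import numpy as np
from scipy.stats import multivariate_normal

def pdaf_update(x, P, H, R, zs, P_D=0.9, P_G=0.99, lam=1e-3):
    """One PDAF-style update with validated measurements zs (list of vectors)."""
    z_pred = H @ x
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    nus = [z - z_pred for z in zs]            # innovations

    # Likelihood ratio of each measurement against the outlier/clutter model.
    L = np.array([P_D * multivariate_normal.pdf(nu, mean=np.zeros(len(nu)), cov=S) / lam
                  for nu in nus])
    denom = 1.0 - P_D * P_G + L.sum()
    beta0 = (1.0 - P_D * P_G) / denom         # probability that all measurements are outliers
    beta = L / denom                          # association probability of each measurement

    # Probability-weighted (combined) innovation and state update.
    nu_c = sum(b * nu for b, nu in zip(beta, nus))
    x_new = x + K @ nu_c

    # Covariance: blend of "no valid measurement" and the standard update,
    # plus the spread-of-innovations term.
    Pc = P - K @ S @ K.T
    spread = sum(b * np.outer(nu, nu) for b, nu in zip(beta, nus)) - np.outer(nu_c, nu_c)
    P_new = beta0 * P + (1.0 - beta0) * Pc + K @ spread @ K.T
    return x_new, P_new
```

The key point illustrated is that when the outlier probability `beta0` is high, the update defers to the inertial prediction rather than the vision measurement, which is the mechanism by which outlier-contaminated measurements are de-weighted.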