Abstract

In this paper, we describe a local-neighborhood, pixel-based adaptive algorithm to track image features, both spatially and temporally, over a sequence of monocular images. The algorithm assumes no a priori knowledge about the image features to be tracked or about the relative motion between the camera and the three-dimensional (3D) objects. The features to be tracked are selected by the algorithm itself; they correspond to the peaks of a ‘correlation surface’ constructed from a local neighborhood in the first image of the sequence to be analysed. Any kind of motion, i.e., full 6-DOF motion (translation and rotation), can be tolerated, subject to the pixels-per-frame motion limitations; no subpixel computations are necessary. Exploiting constraints of temporal continuity, the algorithm performs simple and efficient predictive tracking over multiple frames. Trajectories of features on multiple objects can also be computed. The algorithm tolerates a slow, continuous change in the D.C. brightness level of the feature’s pixels. Another important aspect of the algorithm is the use of an adaptive feature-matching threshold that accounts for changes in the relative brightness of neighboring pixels. As applications of the feature-tracking algorithm, and to test the accuracy of the tracking, we show how the algorithm has been used to extract the Focus of Expansion (FOE) and to compute the time-to-contact from real image sequences of unstructured, unknown environments. In both applications, information from multiple frames is used.
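For context only, a minimal sketch of the standard relation commonly used in FOE-based time-to-contact estimation; the symbols below are generic and are not taken from the paper's own notation or derivation. For a camera translating toward the scene, a tracked feature at image-plane distance $r$ from the FOE, diverging from it at rate $\dot{r}$, gives

\[
  \tau \;=\; \frac{Z}{|\dot{Z}|} \;\approx\; \frac{r}{\dot{r}},
\]

where $Z$ is the feature's depth along the optical axis and $\tau$ is the time-to-contact. The multi-frame feature trajectories produced by a tracker of the kind described above supply the estimates of $r$ and $\dot{r}$.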
