Abstract

Image tracking provides crucial insight into image motion, generating essential information for incremental structure-from-motion reconstruction and camera pose estimation. Typical applications, such as 3D reconstruction and visual odometry, rely on robust and accurate local feature tracking across consecutive images. Current algorithms realize feature tracking by matching features extracted from discriminative textures, so distinctive image content is required to obtain accurate feature matches. For images with little texture, the number of extracted features is usually insufficient to perform reliable tracking across a sequence of images. We propose a method that uses a limited number of discriminative features to discover additional features that lack strong discriminative power on their own. We develop a feature descriptor that integrates the spatial distribution of surrounding salient points, raw pixel values, and coordinate information to recover a significant number of features in weakly textured areas of an image. We also incorporate epipolar geometry into the feature correspondence computation by taking into account the distance from a matching candidate to the epipolar line of its corresponding point. To reduce the number of unreliable features, we project the estimated 3D points back onto the images; the reprojection error is standardized according to each 3D point's depth, which reduces the bias introduced by the object's distance to the camera. We conduct experiments on a large dataset of Arctic sea ice images, composed mainly of flat ice surfaces and sea water. The experimental results demonstrate that our method performs fast and accurate tracking in weakly textured images.
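
The two geometric components mentioned in the abstract can be sketched briefly. The snippet below is a minimal illustration, not the paper's implementation: it assumes a known fundamental matrix F between the two views and a calibrated camera (K, R, t); the weight lam on the epipolar term and the choice to scale the reprojection error by depth are illustrative assumptions, as the abstract does not specify the exact formulation.

```python
import numpy as np

def epipolar_distance(F, x1, x2):
    """Pixel distance from candidate x2 (image 2) to the epipolar line of
    x1 (image 1), given the fundamental matrix F. x1, x2 are (x, y)."""
    p1 = np.array([x1[0], x1[1], 1.0])
    p2 = np.array([x2[0], x2[1], 1.0])
    line = F @ p1                              # epipolar line in image 2: ax + by + c = 0
    return abs(p2 @ line) / np.hypot(line[0], line[1])

def matching_cost(desc_dist, F, x1, x2, lam=0.1):
    """Combined correspondence cost: descriptor distance plus a weighted
    epipolar term. lam is a hypothetical balancing weight."""
    return desc_dist + lam * epipolar_distance(F, x1, x2)

def depth_standardized_reprojection_error(X, K, R, t, x_obs):
    """Reproject 3D point X into the camera (K, R, t) and scale the pixel
    error by the point's depth. Scaling by depth is one plausible way to
    reduce the bias from object distance; the paper's exact normalization
    may differ."""
    Xc = R @ X + t                             # point in camera coordinates
    depth = Xc[2]
    proj = K @ Xc
    x_proj = proj[:2] / proj[2]                # projected pixel coordinates
    return np.linalg.norm(x_proj - np.asarray(x_obs)) * depth
```

In this sketch, candidates whose descriptors match well but that lie far from the expected epipolar line receive a higher cost, and triangulated points whose depth-scaled reprojection error exceeds a threshold would be discarded as unreliable.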
