Abstract
In this paper we propose a robust edge-based approach for 3D textureless object tracking. We first introduce an edge-based pose estimation method, which minimizes the holistic distance between the projected object contour and the query image edges, without explicitly searching for 3D-2D correspondences. This method is accurate given a good initialization; however, it is sensitive to occlusion and fast motion, and thus often loses track in real environments. To improve robustness, we exploit the consistency of edge direction to validate the correctness of the estimated 3D pose, and further incorporate this validation scheme into robust estimation, non-local searching and failure recovery. The robust estimation adopts point-wise validation to reduce the effect of outliers, resulting in a direction-based robust estimator. The non-local searching is based on a particle filter, with the pose validation providing a faithful weighting of particles, which is shown to be better than distance-based weighting. The failure recovery is based on fast 2D detection, and estimates the recovered pose by searching for 3D-2D point correspondences, with the validation scheme adaptively determining the state transition. The effectiveness of our approach is demonstrated through comparative experiments on real image sequences with occlusions, large motions and background clutter.
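As an illustration of the direction-consistency idea described above (a minimal sketch, not the paper's implementation), one can score a pose hypothesis by the fraction of projected contour points whose edge direction agrees with the image edge direction at the matched pixel, and then use that score to weight particles. All function names, the angular threshold, and the input format are assumptions for this sketch:

```python
import numpy as np

def direction_consistency_score(contour_dirs, image_dirs, thresh_deg=20.0):
    """Fraction of matched points whose projected contour edge direction
    agrees with the image edge direction (undirected, i.e. modulo 180 deg).
    Both inputs are 1D arrays of angles in degrees; thresh_deg is an
    assumed tolerance, not a value from the paper."""
    diff = np.abs(np.asarray(contour_dirs) - np.asarray(image_dirs)) % 180.0
    diff = np.minimum(diff, 180.0 - diff)  # edges have no orientation sign
    return float(np.mean(diff < thresh_deg))

def weight_particles(particle_pairs, thresh_deg=20.0):
    """Weight each pose particle by its direction-consistency score and
    normalize; particle_pairs is a list of (contour_dirs, image_dirs)."""
    w = np.array([direction_consistency_score(c, i, thresh_deg)
                  for c, i in particle_pairs])
    s = w.sum()
    # Fall back to uniform weights if every particle fails validation.
    return w / s if s > 0 else np.full(len(w), 1.0 / len(w))
```

A particle whose projected contour aligns with image edges receives a high weight, while a particle that happens to lie near cluttered but differently oriented edges is suppressed, which is the intuition behind preferring this validation-based weighting over purely distance-based weighting.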