Abstract

Perceptible visual tracking is an important module in many perception tasks for autonomous robots, and better features simplify the downstream decision-making process. Evaluating tracked objects, their dynamic positions, and their visual appearance is a difficult task. To date, most real-time visual tracking algorithms suffer from poor robustness and low efficiency when dealing with complex real-world data. In this paper, we propose a more robust and faster visual tracking framework that combines the scale-invariant feature transform (SIFT) with optical flow in a belief propagation (BP) algorithm for efficient processing in real scenarios. A new feature-based optical flow, together with the BP algorithm, is used to compute the affine matrix of a regional center on SIFT keypoints across frames. Experimental results show that the proposed approach is more efficient and more robust than state-of-the-art tracking algorithms in complex scenarios.
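The abstract describes estimating an affine matrix from SIFT keypoint correspondences tracked by optical flow. As a minimal sketch of that core step only (the function name and the synthetic matched points are hypothetical, and this uses plain linear least squares with NumPy rather than the paper's full BP-based pipeline), the affine fit from matched keypoints can look like:

```python
import numpy as np

def estimate_affine(src, dst):
    """Fit a 2x3 affine matrix M mapping src points to dst points
    by linear least squares. src, dst: (N, 2) arrays with N >= 3."""
    n = src.shape[0]
    # Design matrix: one row pair per point,
    # [x y 1 0 0 0] for the x'-equation, [0 0 0 x y 1] for the y'-equation.
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src
    A[1::2, 5] = 1.0
    b = dst.reshape(-1)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)

# Hypothetical matched keypoints (e.g. SIFT matches refined by optical flow):
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
M_true = np.array([[1.1, -0.2, 3.0],
                   [0.3,  0.9, -1.0]])
dst = src @ M_true[:, :2].T + M_true[:, 2]  # apply the known affine map
M_est = estimate_affine(src, dst)
```

In a real tracker, `src`/`dst` would come from SIFT keypoints matched between consecutive frames, typically filtered by an outlier-rejection step before the fit.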
