Abstract

A novel approach to tracking objects in image sequences is proposed, which can both adapt to variations in an object's features and jointly determine its position and scale, provided that its typical appearances are available prior to tracking. Most classical algorithms fail when an object's appearance, scale, and surrounding illumination vary significantly at the same time, because the features they employ are extracted from the object's region in the frame where tracking is initiated and ignore the deviations that occur during tracking. In this paper, we propose to extract the object's features from its typical appearances in reference frames and represent them as the coefficients of a BP (back-propagation) neural network. Owing to the BP network's strong nonlinear classification ability, the new approach can discriminate the object from the background despite its transitions between appearances during tracking. We also propose to determine the object's scale by performing scale-space blob detection within the particle filtering framework, which avoids falling into local minima and enhances the tracker's robustness. Experimental results validate the effectiveness of the proposed approach and demonstrate its improved tracking precision and robustness.
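
To make the two ideas in the abstract concrete, the following is a minimal sketch, not the authors' implementation: a one-hidden-layer BP network trained by back-propagation to separate object patches (from reference-frame appearances) from background patches, and a particle-filter step that jointly samples position and scale, reweighting each particle by the network's confidence multiplied by a crude scale-space blob response. The names `BPNet`, `log_blob_response`, `track_step`, and the `extract_features` callable are hypothetical, and the blob response is approximated by a difference of local box means rather than a true Laplacian-of-Gaussian filter.

```python
# Illustrative sketch only; all names and parameter choices are assumptions.
import numpy as np

rng = np.random.default_rng(0)

class BPNet:
    """One-hidden-layer sigmoid network trained by back-propagation."""
    def __init__(self, n_in, n_hidden=16, lr=0.1):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    @staticmethod
    def _sig(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, X):
        self.h = self._sig(X @ self.W1 + self.b1)       # hidden activations
        self.o = self._sig(self.h @ self.W2 + self.b2)  # object probability
        return self.o

    def train(self, X, y, epochs=200):
        """Fit on features of the object's typical appearances (label 1)
        and surrounding background patches (label 0)."""
        y = y.reshape(-1, 1)
        for _ in range(epochs):
            out = self.forward(X)
            d_o = (out - y) * out * (1 - out)                 # output delta
            d_h = (d_o @ self.W2.T) * self.h * (1 - self.h)   # hidden delta
            self.W2 -= self.lr * self.h.T @ d_o / len(X)
            self.b2 -= self.lr * d_o.mean(0)
            self.W1 -= self.lr * X.T @ d_h / len(X)
            self.b1 -= self.lr * d_h.mean(0)

def log_blob_response(image, x, y, sigma):
    """Stand-in for a scale-normalised blob response at (x, y, sigma):
    difference of inner and outer box means (placeholder for a proper
    scale-space Laplacian-of-Gaussian filter)."""
    r_in, r_out = int(max(1, sigma)), int(max(2, 2 * sigma))
    h, w = image.shape
    def mean_patch(r):
        x0, x1 = max(0, int(x - r)), min(w, int(x + r) + 1)
        y0, y1 = max(0, int(y - r)), min(h, int(y + r) + 1)
        return image[y0:y1, x0:x1].mean()
    return abs(mean_patch(r_in) - mean_patch(r_out))

def track_step(image, particles, weights, net, extract_features,
               noise=(3.0, 3.0, 0.1)):
    """One particle-filter update over the state (x, y, scale):
    resample, diffuse, then reweight by appearance confidence times
    the blob response at the particle's scale."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)                      # resampling
    particles = particles[idx] + rng.normal(0, noise, (n, 3))   # random-walk motion
    feats = np.stack([extract_features(image, *p) for p in particles])
    appearance = net.forward(feats).ravel()                     # object vs. background
    scale_score = np.array([log_blob_response(image, *p) for p in particles])
    weights = appearance * (scale_score + 1e-6)
    weights /= weights.sum()
    estimate = weights @ particles                              # weighted mean state
    return particles, weights, estimate
```

Because every particle carries its own scale hypothesis and is scored globally across the sample set, the estimate does not hinge on a local gradient search over scale, which is the intuition behind the claim that the scheme avoids local minima.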
