Abstract
To track a moving object and learn a deep, compact image representation in complex environments, a novel robust incremental deep learning tracker is presented within the particle filter framework. The incremental deep classification network combines a stacked denoising autoencoder, incremental feature learning, and a support vector machine to extract features from and classify the particle set. Deep learning is used to obtain effective image representations: unsupervised feature learning learns generic image features, and transfer learning carries knowledge from offline training into the online tracking process. Incremental feature learning consists of adding and merging features to learn a compact feature set online. A linear support vector machine improves discrimination between targets with similar appearance and is further tuned to adapt to appearance changes of the moving object. Experiments on various challenging image sequences show that, compared with state-of-the-art trackers in complex environments, the incremental deep learning tracker handles the shortcomings of existing trackers more efficiently and is more robust and more accurate, especially under occlusion, background clutter, illumination changes, and appearance changes.
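To make the pipeline summarized above more concrete, the following is a minimal sketch, not the authors' implementation: an offline-pretrained denoising autoencoder supplies features, and a linear SVM scores particle-filter candidates online. All names (e.g., DenoisingEncoder, score_particles), patch sizes, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch (assumes 32x32 grayscale particle patches); not the paper's code.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import LinearSVC

class DenoisingEncoder(nn.Module):
    """Encoder/decoder pair standing in for a stacked denoising autoencoder (SDAE)."""
    def __init__(self, in_dim=32 * 32, hidden=(512, 128)):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden[0]), nn.ReLU(),
            nn.Linear(hidden[0], hidden[1]), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(hidden[1], hidden[0]), nn.ReLU(),
            nn.Linear(hidden[0], in_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def pretrain_sdae(model, patches, noise_std=0.1, epochs=5, lr=1e-3):
    """Unsupervised offline pretraining: reconstruct clean patches from noisy input."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    x = torch.as_tensor(patches, dtype=torch.float32)
    for _ in range(epochs):
        noisy = x + noise_std * torch.randn_like(x)     # corrupt the input
        loss = nn.functional.mse_loss(model(noisy), x)  # denoising objective
        opt.zero_grad(); loss.backward(); opt.step()
    return model

def score_particles(model, svm, particle_patches):
    """Encode candidate particle patches and score them with the linear SVM."""
    with torch.no_grad():
        feats = model.encoder(torch.as_tensor(particle_patches, dtype=torch.float32))
    return svm.decision_function(feats.numpy())  # higher = more target-like

# --- toy usage on random data ---
rng = np.random.default_rng(0)
model = pretrain_sdae(DenoisingEncoder(), rng.random((256, 32 * 32), dtype=np.float32))

# Online step: fit the SVM on encoded positive (target) and negative (background) patches.
pos = rng.random((20, 32 * 32), dtype=np.float32)
neg = rng.random((80, 32 * 32), dtype=np.float32)
with torch.no_grad():
    X = model.encoder(torch.as_tensor(np.vstack([pos, neg]))).numpy()
y = np.r_[np.ones(len(pos)), np.zeros(len(neg))]
svm = LinearSVC().fit(X, y)

# Pick the particle with the highest SVM score as the tracked state for this frame.
particles = rng.random((200, 32 * 32), dtype=np.float32)
best = int(np.argmax(score_particles(model, svm, particles)))
```

In this sketch the incremental feature learning and SVM adaptation described in the abstract are omitted; only the fixed encoder-plus-SVM scoring loop is shown.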