Abstract
Color-based particle filters have emerged as an appealing method for target tracking. As the target may undergo rapid and significant appearance changes, the template (i.e., the scale of the target and its color distribution histogram) also needs to be updated. Traditional updates that ignore contextual information carry a high risk of distorting the model and losing the target. In this paper, a new algorithm is put forward that exploits environmental information to update both the scale of the tracker and the reference appearance model for object tracking in video sequences. The proposal builds on the well-established color-based particle filter while differentiating foreground and background particles according to their matching scores. A roaming phenomenon that causes the estimate to shrink and diverge is also investigated. The proposed solution is tested on both simulated and publicly available benchmark datasets, where it is compared with six state-of-the-art trackers. The results demonstrate the feasibility of the proposal and lay the foundations for further research on complex visual tracking problems.
Highlights
With the widespread adoption of multimedia standards such as MPEG-4, a large number of surveillance cameras have been deployed in cities, public buildings, motorways, etc.
This has resulted in a substantial increase in multimedia data, which, in turn, has made robust automatic tracking from video sequences a task of paramount importance, given the rising cost of manual labour.
The colour feature is acknowledged for its computational efficiency, invariance to scale and resolution changes, and robustness to partial occlusion, as pointed out in [8], which justifies its wide use in object tracking problems.
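As an informal illustration of the colour feature referred to above, the following Python sketch (the authors' implementation is in Matlab) computes a normalised joint RGB histogram for an image patch. The bin count and binning scheme are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def colour_histogram(patch, bins=8):
    """Normalised joint RGB histogram of an image patch.

    patch: (H, W, 3) uint8 array; `bins` quantisation levels per channel
    is an illustrative choice, not the paper's setting.
    """
    # Quantise each channel into `bins` levels.
    q = (patch.astype(np.int64) * bins) // 256              # values in [0, bins)
    # Flatten the three channel indices into one joint-bin index.
    idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]
    hist = np.bincount(idx.ravel(), minlength=bins ** 3).astype(np.float64)
    return hist / hist.sum()                                # normalised to sum to 1
```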
Summary
We analyse the results of the proposed algorithm using both simulated videos and publicly available datasets [51]. At each frame, the tracker proceeds as follows:
1. Calculate the colour histogram for each sample of the set {s_k^(i)} and weight each particle {ω_k^(i)} using (6).
2. (a) Estimate the mean state of the set S_k by Eq. (8); (b) resample as usual (i.e. systematic resampling, Eq. (10)).
3. If shrinking is detected: (a) modify the tracker scale by Eqs. (20)-(23); (b) calculate the new weight of the modified estimate using (24) and (6), and apply resampling.
The simulated environment is used to tune the parameters involved in our adaptive scale and appearance model update. The performance of our tracker is then tested on real scenarios from benchmark datasets involving significant scale changes and appearance variation of the target. To highlight the improvement obtained with our appearance adaptation method, the performance is compared with a no-update scheme and with the alternative update employed in [14], where h_k = (1 − μ) h_{k−1} + μ h_obs (30). The implementation is performed in Matlab on a standard PC with an Intel® Core™ i7-10700K processor, 16 GB of RAM, and a 1000 GB disk.
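To make the per-frame procedure concrete, here is a minimal Python sketch of one step of a colour-based particle filter that uses the blended reference-model update of Eq. (30), i.e. the baseline scheme of [14]; it does not reproduce the paper's context-aware scale and appearance update, and the function names and parameter values (extract_hist, sigma_pos, lam, mu) are illustrative assumptions.

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient between two normalised histograms."""
    return np.sum(np.sqrt(h1 * h2))

def track_step(frame, particles, ref_hist, extract_hist,
               sigma_pos=5.0, lam=20.0, mu=0.05):
    """One frame of a colour-based particle filter with the blended
    appearance update h_k = (1 - mu) * h_{k-1} + mu * h_obs (Eq. 30).

    particles: (N, 2) array of (x, y) centres.
    extract_hist(frame, centre): returns the normalised colour histogram
    of the patch around `centre` (e.g. colour_histogram above).
    All parameter values are illustrative, not the paper's settings.
    """
    n = len(particles)

    # 1. Propagate particles with a simple random-walk motion model.
    particles = particles + np.random.normal(0.0, sigma_pos, particles.shape)

    # 2. Weight each particle by how well its patch histogram matches the
    #    reference model (larger Bhattacharyya distance -> lower weight).
    hists = [extract_hist(frame, p) for p in particles]
    dist = np.array([1.0 - bhattacharyya(h, ref_hist) for h in hists])
    weights = np.exp(-lam * dist)
    weights /= weights.sum()

    # 3. Mean-state estimate of the target position.
    estimate = weights @ particles

    # 4. Systematic resampling back to equal weights.
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.minimum(idx, n - 1)        # guard against floating-point round-off
    particles = particles[idx]

    # 5. Blend the histogram observed at the estimate into the reference model.
    h_obs = extract_hist(frame, estimate)
    ref_hist = (1.0 - mu) * ref_hist + mu * h_obs

    return particles, estimate, ref_hist
```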