Abstract

Visual tracking, in essence, deals with non-stationary data streams that change over time. While most existing algorithms are able to track objects well in controlled environments, they usually fail when there is a significant change in object appearance or surrounding illumination. The reason is that these visual tracking algorithms operate on the premise that the models of the objects being tracked are invariant to internal appearance changes or external variations such as lighting or viewpoint. Consequently, most tracking algorithms do not update the models once they are built or learned at the outset. In this paper we present a novel algorithm, based on Gaussian mixture models and a modified particle filter, for tracking human faces in image sequences even under total occlusion. We estimate the location of the face using the modified particle filter and update the weights of the samples using Gaussian mixture models and the Bhattacharyya distance between two Gaussian distributions. Experiments demonstrate that the proposed method tracks faces well under large lighting changes and near-total occlusion with close to real-time performance.
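
The abstract does not give the exact weighting rule, but the following is a minimal sketch of how particle weights could be updated from the closed-form Bhattacharyya distance between two Gaussians. It assumes each particle's image patch is summarised by a single Gaussian (mean and covariance in some feature space) compared against a reference face model; the function names, the exponential weighting, and the bandwidth `sigma` are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Closed-form Bhattacharyya distance between two multivariate Gaussians."""
    cov = 0.5 * (cov1 + cov2)          # averaged covariance
    diff = mu1 - mu2
    mean_term = 0.125 * diff @ np.linalg.solve(cov, diff)
    cov_term = 0.5 * np.log(
        np.linalg.det(cov) / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))
    )
    return mean_term + cov_term

def reweight_particles(particle_models, reference_model, sigma=0.1):
    """Weight each particle by a similarity score that decays with the
    Bhattacharyya distance to the reference face model (assumed scheme)."""
    ref_mu, ref_cov = reference_model
    distances = np.array([
        bhattacharyya_gaussian(mu, cov, ref_mu, ref_cov)
        for mu, cov in particle_models
    ])
    weights = np.exp(-distances / sigma)
    return weights / weights.sum()     # normalise so weights sum to one
```

In a particle-filter loop, these normalised weights would drive the resampling step, so particles whose appearance model stays close to the (possibly updated) reference Gaussian survive lighting changes and partial occlusion.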
