Abstract

We propose a new method for identifying persons by analyzing the dynamics of their facial expressions, using Active Appearance Models and accurate facial feature point tracking. Several methods have been proposed for identifying persons from facial images. In most of them, variation in facial expression is a confounding factor; however, the dynamics of facial expressions are themselves a personal characteristic. In the proposed method, facial feature points are automatically extracted with Active Appearance Models in the first frame of each video and then tracked with a Lucas-Kanade based feature point tracking method. Next, the temporal interval from the onset to the offset of the facial expression change is extracted, and a feature vector is obtained. In the identification phase, an input feature vector is classified by computing the distance between the input vector and the training vectors with dynamic programming matching. We demonstrate the effectiveness of the proposed method on smile videos from the MMI Facial Expression Database.

Keywords: facial expression analysis, AAMs, LK-based feature point tracking, DP matching, person identification
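
The abstract outlines a three-stage pipeline: AAM-based feature point initialization, Lucas-Kanade tracking across frames, and identification by dynamic programming matching against training sequences. The sketch below is a minimal, hypothetical Python illustration of the tracking and matching stages only; the AAM fit is assumed to be supplied externally, and the OpenCV parameters, the classic DTW recursion, and the nearest-neighbour decision rule are illustrative assumptions, not the paper's exact implementation.

```python
# Hypothetical sketch of the tracking-and-matching stages described in the abstract.
# The AAM fitting step is replaced by externally supplied initial points; window
# size, DTW recursion, and nearest-neighbour rule are assumptions for illustration.
import numpy as np
import cv2


def track_feature_points(video_path, initial_points):
    """Track facial feature points through a video with pyramidal Lucas-Kanade.

    initial_points: (N, 2) float32 array of points in the first frame,
    e.g. obtained from an AAM fit (assumed to be done elsewhere).
    Returns an array of shape (T, N, 2): point positions per frame.
    """
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError(f"cannot read {video_path}")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = initial_points.reshape(-1, 1, 2).astype(np.float32)
    trajectory = [pts.reshape(-1, 2).copy()]

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
        pts, prev_gray = next_pts, gray
        trajectory.append(pts.reshape(-1, 2).copy())
    cap.release()
    return np.stack(trajectory)  # (T, N, 2)


def dtw_distance(seq_a, seq_b):
    """Dynamic-programming (DTW-style) distance between two feature sequences.

    seq_a: (Ta, D) and seq_b: (Tb, D) feature vectors, one row per frame.
    """
    ta, tb = len(seq_a), len(seq_b)
    cost = np.full((ta + 1, tb + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, ta + 1):
        for j in range(1, tb + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[ta, tb]


def identify(input_seq, training_sets):
    """Nearest-neighbour identification over labelled training sequences."""
    best_label, best_dist = None, np.inf
    for label, sequences in training_sets.items():
        for ref in sequences:
            d = dtw_distance(input_seq, ref)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label, best_dist
```

In a full implementation, each tracked trajectory would first be clipped to the extracted onset-to-offset interval of the expression change before being compared with dtw_distance, as the abstract describes.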
