Abstract

Human emotions are largely conveyed by facial expressions, and facial expressions, both simple and complex, are well described by facial action units: any expression can be detected and analyzed once its action units are decoded. In the presented work, facial action unit intensity is detected by mapping features with distance metric learning based on cosine similarity, which learns a metric that measures orientation rather than magnitude. The motivation for using cosine similarity is that changes in facial expressions are better represented by changes in orientation than by changes in magnitude. The mapped features are then classified by a support vector machine into the various action unit intensity levels. Experimental results on two widely used databases, the DISFA database and the UNBC-McMaster shoulder pain database, confirm the efficacy of the proposed approach.
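The stated motivation, that orientation captures expression changes better than magnitude, can be illustrated with a minimal sketch of the cosine similarity measure itself. The feature vectors below are hypothetical placeholders, not values from the DISFA or UNBC-McMaster data; the point is only that scaling a vector leaves its cosine similarity unchanged, while rotating it does not.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v (orientation, not magnitude)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical facial-feature vectors: v_scaled has the same orientation
# as v but twice the magnitude; w points in a different direction.
v = [0.2, 0.5, 0.1]
v_scaled = [0.4, 1.0, 0.2]
w = [0.5, 0.1, 0.4]

print(cosine_similarity(v, v_scaled))  # ≈ 1.0: magnitude change is ignored
print(cosine_similarity(v, w))         # < 1.0: orientation change is detected
```

In the proposed pipeline, a metric of this kind is learned over the feature space before the features are passed to the support vector machine for intensity classification.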
