Abstract

Automatic facial expression analysis is an important aspect of human-machine interaction, as the face is a primary communicative medium. We use our faces to signal interest, disagreement, intentions, or mood through subtle facial motions and expressions. Work on automatic facial expression analysis can roughly be divided into the recognition of prototypic facial expressions, such as the six basic emotional states, and the recognition of atomic facial muscle actions (action units, AUs). Detecting AUs rather than emotions makes facial expression analysis independent of culture-dependent interpretation, reduces the dimensionality of the problem, and reduces the amount of training data required. Classic psychological studies suggest that humans consciously map AUs onto the basic emotion categories using a finite number of rules. On the other hand, recent studies suggest that humans recognize emotions unconsciously, through a process that is perhaps best modeled by artificial neural networks (ANNs). This paper investigates these two claims. A comparison is made between detecting emotions directly from features and a two-step approach in which we first detect AUs and then use the AUs as input to either a rule base or an ANN to recognize emotions. The results suggest that the two-step approach is possible with a small loss of accuracy, and that biologically inspired classification techniques outperform those that approach the classification problem from a logical perspective, suggesting that biologically inspired classifiers are more suitable for computer-based analysis of facial behavior than logic-inspired methods.
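To make the two-step approach concrete, the sketch below shows one possible way to map a detected set of AUs to one of the six basic emotions, either with a hand-written rule base or with a small ANN. This is an illustrative sketch only, not the authors' implementation: the specific AU-to-emotion rules, the 45-dimensional AU encoding, and the `MLPClassifier` configuration are assumptions chosen as placeholders.

```python
# Illustrative two-step pipeline sketch (not the paper's implementation):
# stage 1 is assumed to output the set of active AUs; stage 2 maps AUs to
# an emotion label via either a rule base or a small neural network.

from typing import Dict, List, Set
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["happiness", "sadness", "surprise", "fear", "anger", "disgust"]

# Hypothetical rule base: each emotion is associated with a set of AUs.
# These combinations are common textbook examples, used here only as placeholders.
RULES: Dict[str, List[int]] = {
    "happiness": [6, 12],        # cheek raiser + lip corner puller
    "sadness":   [1, 4, 15],     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  [1, 2, 5, 26],  # brow raisers + upper lid raiser + jaw drop
    "fear":      [1, 2, 4, 5, 20],
    "anger":     [4, 5, 7, 23],
    "disgust":   [9, 15],
}

def rule_based_emotion(active_aus: Set[int]) -> str:
    """Return the emotion whose rule matches the largest fraction of its AUs."""
    scores = {emo: sum(au in active_aus for au in aus) / len(aus)
              for emo, aus in RULES.items()}
    return max(scores, key=scores.get)

def aus_to_vector(active_aus: Set[int], n_aus: int = 45) -> np.ndarray:
    """Encode a set of active AU numbers as a fixed-length binary vector."""
    v = np.zeros(n_aus)
    for au in active_aus:
        v[au - 1] = 1.0
    return v

def train_ann(X: np.ndarray, y: List[str]) -> MLPClassifier:
    """ANN alternative: fit a small multilayer perceptron on (AU vector, emotion) pairs."""
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, y)
    return clf

if __name__ == "__main__":
    detected = {6, 12, 25}  # AUs assumed to come from a first-stage detector
    print(rule_based_emotion(detected))  # -> "happiness"
```

In this sketch, the rule base corresponds to the conscious, finite-rule mapping hypothesis, while the MLP stands in for the biologically inspired, learned mapping; the paper's comparison is between these two kinds of second-stage classifiers.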
