Abstract

In recent years, facial emotional expression recognition has attracted many researchers developing intelligent human-machine interface (HMI) systems. The present work classifies six facial expressions (happiness, sadness, anger, fear, disgust, and surprise) using two classifiers: K-Nearest Neighbor (KNN) and Decision Tree (DT). Fifty-five undergraduate university students (35 male, 20 female) with a mean age of 23.9 years voluntarily participated in an experiment to acquire the six facial emotional expressions using ten virtual markers called Facial Action Units (FAUs). First, Haar-like features are used to detect the face and eyes in each video frame with the Viola-Jones AdaBoost classification method. The FAUs are then placed at specific locations on the subject's face, based on the Facial Action Coding System (FACS), using a mathematical model. The Lucas-Kanade optical flow algorithm continuously tracks the marker positions. The distances from the FAU at the center of the subject's face to the other markers are computed and used as features for facial expression classification. A one-way analysis of variance at a significance level of p < 0.01 validates the extracted features, and fivefold cross-validation is performed. Finally, the cross-validated features are mapped to the six facial emotional expressions using the KNN and DT classifiers. For KNN, four distance measures are compared to assess their effect on emotion classification performance. Mean classification accuracies of 98.03% and 97.21% are achieved with KNN and DT, respectively. This accuracy of emotional expression detection using an optical flow algorithm paves the way for designing real-time systems for a variety of applications.
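As an illustration of the feature-extraction and classification steps described above, the following is a minimal sketch (not the authors' implementation) of computing center-to-marker distance features from tracked FAU coordinates and classifying them with a small KNN that supports several distance measures. The specific distance measures shown (Euclidean, Manhattan, Chebyshev, Minkowski with p = 3) are assumptions for illustration; the paper does not list which four were used, and face detection and optical-flow tracking are taken as already done.

```python
import numpy as np

def marker_distance_features(markers, center_idx=0):
    """Distances from the central FAU marker to every other marker.

    markers: (n_markers, 2) array of tracked (x, y) positions.
    Returns an (n_markers - 1,) feature vector, as described in the abstract.
    """
    center = markers[center_idx]
    others = np.delete(markers, center_idx, axis=0)
    return np.linalg.norm(others - center, axis=1)

def knn_predict(X_train, y_train, x, k=3, metric="euclidean"):
    """Classify feature vector x by majority vote among its k nearest neighbors."""
    diff = X_train - x
    if metric == "euclidean":
        d = np.sqrt((diff ** 2).sum(axis=1))
    elif metric == "manhattan":
        d = np.abs(diff).sum(axis=1)
    elif metric == "chebyshev":
        d = np.abs(diff).max(axis=1)
    elif metric == "minkowski":  # p = 3, an illustrative choice
        d = ((np.abs(diff) ** 3).sum(axis=1)) ** (1.0 / 3.0)
    else:
        raise ValueError(f"unknown metric: {metric}")
    nearest_labels = y_train[np.argsort(d)[:k]]
    values, counts = np.unique(nearest_labels, return_counts=True)
    return values[counts.argmax()]
```

A real pipeline would feed `marker_distance_features` with marker positions tracked per frame (e.g. via OpenCV's Viola-Jones cascades and `calcOpticalFlowPyrLK`), accumulate labeled feature vectors, and evaluate with fivefold cross-validation.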
