Abstract

Customer satisfaction can be measured through facial expression recognition. However, the current generation of artificial-intelligence systems depends heavily on facial features such as the eyebrows, eyes, and forehead. This dependence is a limitation because people often conceal their genuine emotions. Body gestures are harder to conceal and can convey a more detailed and accurate emotional state, so the authors incorporate upper-body gestures as an additional feature to improve the accuracy of the predicted emotion. This work uses an ensemble machine-learning model that integrates support vector machines, random forest classifiers, and logistic regression classifiers. The proposed method detects emotions from facial expressions and upper-body movements; experimental evaluation shows it to be effective, with 97% accuracy on the EMOTIC dataset and 99% accuracy on the MELD dataset.
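
The abstract names the three base classifiers but not how they are combined. A minimal sketch is given below, assuming a soft-voting ensemble over fused face and upper-body feature vectors; the voting strategy, feature layout, synthetic data, and hyperparameters are all illustrative assumptions, not the authors' confirmed configuration.

```python
# Sketch of an ensemble of the three classifiers named in the abstract,
# assuming soft voting (averaged class probabilities). All data and
# parameters below are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical fused feature matrix: each row concatenates facial
# expression features with upper-body gesture features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))    # 200 samples, 64 fused features
y = rng.integers(0, 7, size=200)  # e.g. 7 emotion classes

ensemble = VotingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ],
    voting="soft",  # average predicted class probabilities across models
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))  # predicted emotion labels for 5 samples
```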
