Abstract

We investigated a method for detecting genuine smiles from observers' physiological states. We recorded two physiological measures from people watching videos of smiles: pupillary response (PR) and galvanic skin response (GSR). The smile videos were drawn from two benchmark databases, MAHNOB and AFEW. Based on their elicitation process, MAHNOB videos were classified as showing genuine (real) smiles and AFEW videos as not showing real smiles. A leave-one-observer-out procedure was employed to evaluate classification performance using k-nearest neighbor (KNN), support vector machine (SVM), simple neural network (NN), and ensemble classifiers. Different noise removal techniques and a feature selection method, canonical correlation analysis with a neural network (NCCA), were applied to find minimally correlated features for the classes. With these methods, the ensemble classifier achieved the highest classification accuracies of 97.8% for PR and 96.6% for GSR signals. In comparison, the observers themselves (n = 20) correctly judged smiles as real only 58.9% of the time on average, rising to 68.4% by voting, rates similar to those reported in the literature and indicating that our data are of comparable quality. Overall, our results demonstrate that user-independent analyses of physiological measures can substantially outperform individual self-reports for detecting real smiles.
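The evaluation protocol described above can be illustrated with a short sketch. The following Python code shows a leave-one-observer-out loop over pre-extracted feature vectors, using scikit-learn's LeaveOneGroupOut splitter and a random forest as a stand-in for the paper's ensemble classifier. Feature extraction, noise removal, and NCCA feature selection are not shown; the function and variable names, the classifier choice, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of leave-one-observer-out evaluation (assumed pipeline, not the authors' code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def leave_one_observer_out_accuracy(X, y, observer_ids):
    """Train on all observers but one, test on the held-out observer, and average accuracy."""
    logo = LeaveOneGroupOut()
    # Random forest used here as a generic ensemble classifier; the paper's exact ensemble may differ.
    clf = make_pipeline(StandardScaler(),
                        RandomForestClassifier(n_estimators=100, random_state=0))
    accuracies = []
    for train_idx, test_idx in logo.split(X, y, groups=observer_ids):
        clf.fit(X[train_idx], y[train_idx])
        accuracies.append(clf.score(X[test_idx], y[test_idx]))
    return float(np.mean(accuracies))


# Example usage with synthetic data:
# X holds one feature vector per (observer, video) trial; y is 1 for a real smile, 0 otherwise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = rng.integers(0, 2, size=200)
observer_ids = np.repeat(np.arange(20), 10)  # 20 observers, 10 trials each
print(f"mean leave-one-observer-out accuracy: "
      f"{leave_one_observer_out_accuracy(X, y, observer_ids):.3f}")
```

Grouping the splits by observer ensures that no observer's trials appear in both training and test sets, which is what makes the reported accuracies user-independent.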
