Abstract

While the relationship between facial expressions and emotion has been a productive area of inquiry, research has only recently begun to explore whether a link exists between facial expressions and cognitive processes. Using findings from psychology and neuroscience to guide predictions of affect during a cognitive task, this article studies facial dynamics as a means of understanding comprehension. We present a new multimodal facial expression database, named Facial Expressions of Comprehension (FEC), consisting of videos recorded during a computer-mediated task in which each trial consisted of reading a general-knowledge statement, answering whether it is true or false, and receiving feedback. To identify the level of engagement with the corresponding stimuli, we present a new methodology that uses animation units (AnUs) from the Kinect v2 device to capture the changes in facial configuration caused by an event: Event-Related Intensities (ERIs). To identify dynamic facial configurations, we analyzed ERIs statistically with generalized additive models. To identify differential facial dynamics linked to knowing vs. guessing and true vs. false responses, we employed an SVM classifier with facial appearance information extracted using LPQ-TOP. Results of ERIs in sentence comprehension show that facial dynamics are a promising avenue for understanding affective and cognitive states of mind.
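To make the ERI idea concrete, the following is a minimal sketch, not the authors' implementation: `compute_eri`, the epoch lengths, and the synthetic data are all illustrative assumptions. It epochs and baseline-corrects one animation-unit track around event onsets (analogous to event-related potentials in EEG), then runs a toy knowing-vs-guessing SVM of the kind the abstract describes, assuming LPQ-TOP appearance descriptors have already been extracted as one feature vector per trial.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def compute_eri(anu_track, event_frames, pre=15, post=60):
    """Event-Related Intensity (hypothetical sketch): average
    baseline-corrected intensity of one Kinect v2 animation unit (AnU)
    around event onsets.

    anu_track    -- 1-D array, one AnU intensity value per video frame
    event_frames -- frame indices at which the event (e.g. feedback) occurs
    pre, post    -- frames kept before/after each onset (Kinect ~30 fps)
    """
    epochs = []
    for onset in event_frames:
        if onset - pre < 0 or onset + post > len(anu_track):
            continue                      # skip events near recording edges
        epoch = anu_track[onset - pre:onset + post].astype(float)
        epoch -= epoch[:pre].mean()       # baseline-correct on pre-event frames
        epochs.append(epoch)
    return np.mean(epochs, axis=0)        # the ERI waveform for this AnU

# --- toy demonstration with synthetic stand-in data ----------------------
rng = np.random.default_rng(0)
anu_track = rng.random(3000)              # stand-in for a recorded AnU signal
eri = compute_eri(anu_track, event_frames=[300, 900, 1500, 2100])
print(eri.shape)                          # (75,): 15 pre + 60 post frames

# Knowing-vs-guessing classification on hypothetical precomputed
# LPQ-TOP descriptors (the real descriptors come from video volumes):
X = rng.random((40, 768))                 # stand-in LPQ-TOP feature vectors
y = rng.integers(0, 2, size=40)           # 1 = knowing, 0 = guessing
print(cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean())
```

In the pipeline the abstract outlines, ERI curves like the one computed above are the inputs to the generalized additive models, which are well suited to modeling such smooth per-condition time courses.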
