Abstract

Background: Facial expressions of emotion are classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotion, however, would open up new possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human–computer retroactions between physiological measures and the virtual agent.

Objectives: The goal of this study was to initially assess the concomitants and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomaticus major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions.

Results: Emotions expressed by each set of stimuli were similarly recognized by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in the eye regions from male and female participants.

Conclusion: Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain–Computer Interface studies with feedback–feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

Highlights

  • Recognizing emotions expressed non-verbally by others is crucial for harmonious interpersonal exchanges

  • Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties

  • Brain–Computer Interface studies with feedback–feedforward interactions based on facial emotion expressions can be conducted with these stimuli

Introduction

Recognizing emotions expressed non-verbally by others is crucial for harmonious interpersonal exchanges. Presentations of photographs of real faces allowed the classic discovery that humans are generally able to correctly perceive six fundamental emotions (happiness, surprise, fear, sadness, anger, and disgust) experienced by others from their facial expressions (Ekman and Oster, 1979). These stimuli helped document social cognition impairment in neuropsychiatric disorders such as autism (e.g., Dapretto et al., 2006), schizophrenia (e.g., Kohler et al., 2010), and psychopathy (Deeley et al., 2006). Virtual faces allow real-time human–computer retroactions between physiological measures and the virtual agent.
