Abstract

This study examined whether the cortical processing of emotional faces is modulated by the computerization of face stimuli ("avatars") in a group of 25 healthy participants. Subjects passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERS/ERD), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only for the neutral avatar condition, later potentials (N300 and LPP) differed in both emotional conditions (neutral and fear) and between the presented agents (actor and avatar). In addition, avatar faces elicited significantly stronger theta and alpha oscillatory responses than actor faces. Theta frequencies in particular responded specifically to visual emotional stimulation and proved sensitive to the emotional content of the face, whereas alpha frequencies were modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS, evoking neural effects that differ from those elicited by real faces. This held true even though the avatars were replicas of the human faces and carried similar expressive characteristics.
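The abstract states that alpha and theta ERD/ERS were derived from the EEG. The authors' exact pipeline is not given here; as an illustration only, ERD/ERS is conventionally quantified as the percent change in band power during a post-stimulus window relative to a pre-stimulus baseline (negative values = desynchronization, positive = synchronization). A minimal NumPy sketch with synthetic data and hypothetical parameters:

```python
import numpy as np

def band_power(signal, fs, low, high):
    # Sum of FFT power within a frequency band [low, high] in Hz.
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def erd_ers_percent(baseline, activity, fs, low, high):
    # Percent band-power change of the activity window relative to baseline.
    # Negative -> event-related desynchronization (ERD),
    # positive -> event-related synchronization (ERS).
    r = band_power(baseline, fs, low, high)
    a = band_power(activity, fs, low, high)
    return (a - r) / r * 100.0

# Toy demonstration: alpha (8-12 Hz) power drops after stimulus onset.
fs = 250  # assumed sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)
baseline = np.sin(2 * np.pi * 10 * t)        # strong 10 Hz alpha at rest
activity = 0.5 * np.sin(2 * np.pi * 10 * t)  # attenuated alpha post-stimulus
print(round(erd_ers_percent(baseline, activity, fs, 8, 12)))  # → -75 (alpha ERD)
```

In practice this is done per trial on band-pass-filtered epochs and averaged, but the baseline-relative power ratio above is the core of the measure.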

Highlights

  • Facial expressions of emotion play an important role in human interactions and communication

  • N300 amplitudes did not differ between emotional conditions for the avatar, but decreased significantly for both fearful [t = –3.54, p < 0.00] and neutral [t = –6.93, p < 0.00] avatar faces compared to actor faces

  • Late positive potentials (LPP) likewise did not differ between emotional conditions for the avatar, but increased significantly in both conditions [fear t = –3.34, p < 0.00; neutral t = –7.72, p < 0.00] for the avatar compared to the actor


Introduction

Facial expressions of emotion play an important role in human interactions and communication. During the past few years, many studies have investigated the neuronal mechanisms involved in the processing of emotional faces (Krolak-Salmon et al., 2001; Balconi and Lucchiari, 2006; Moore et al., 2012). Evaluating these mechanisms requires valid stimuli that fully capture the facial and emotion-related characteristics of an expression. A promising way to address this is the use of three-dimensional computer-generated faces, so-called avatar faces, which allow researchers to form and systematically control important features of the facial expression. The motion of the facial expression was tracked with 66 tracking points to convey the actors' recorded expressions onto the avatar faces.

