Abstract

The detection of a human face in the visual field and the correct reading of emotional expressions of faces are important elements in everyday social interactions, decision making and emotional responses. Although brain correlates of face processing have been established in previous fMRI and electroencephalography (EEG)/MEG studies, little is known about how the brain represents faces and their emotional expressions in freely moving humans. The present study aimed to detect brain electrical potentials that occur during the viewing of human faces in natural settings. 64-channel wireless EEG and eye-tracking data were recorded in 19 participants while they moved in a mock art gallery and stopped at times to evaluate pictures hung on the walls. Pictures of objects and human faces of positive, negative and neutral valence were displayed. The time instants at which pictures first occurred in the visual field were identified in the eye-tracking data and used to reconstruct the triggers in the continuous EEG data after synchronizing the time axes of the EEG and eye-tracking devices. The EEG data showed a clear face-related event-related potential (ERP) in the latency interval ranging from 165 to 210 ms (N170); this component was not seen while participants were viewing non-living objects. The face ERP component was stronger during viewing of disgusted compared with neutral faces. Source dipole analysis revealed an equivalent current dipole in the right fusiform gyrus (BA37) accounting for the N170 potential. Our study demonstrates for the first time the possibility of recording brain responses to human faces and emotional expressions in natural settings. This finding opens new possibilities for clinical, developmental, social, forensic, or marketing research in which information about face processing is of importance.

Highlights

  • Facial expressions are evolutionarily based and culturally conditioned tools

  • While earlier studies reported a lack of encoding of emotional facial expression by the N170 potential (Herrmann et al, 2002; Eimer et al, 2003), a recent meta-analysis confirmed the encoding of emotional facial expressions in the amplitudes of the N170 potential (Hinojosa et al, 2015).

  • To the best of our knowledge, this study is the first to demonstrate the presence of a face-sensitive scalp potential during viewing of human faces in natural settings


Introduction

Facial expressions are evolutionarily based and culturally conditioned tools. They steer social interactions, solicit help and inform about events in the social environment as well as the intentions of the expresser (Matsumoto et al, 2008). Previous brain imaging studies have shown that a set of brain regions in the occipitotemporal cortex is associated with processing human faces (Kanwisher et al, 1997; Haxby et al, 2002). Electroencephalographic event-related potentials (ERPs) revealed a negative potential, N170, at lateral occipitotemporal regions of the scalp, which responded with greater amplitude to faces than to objects (Bentin et al, 1996). While earlier studies reported a lack of encoding of emotional facial expression by the N170 potential (Herrmann et al, 2002; Eimer et al, 2003), a recent meta-analysis confirmed the encoding of emotional facial expressions in the amplitudes of the N170 potential (Hinojosa et al, 2015).
