Abstract
Facial expressions are an important part of the non-verbal communication used in everyday life. The N170 is widely regarded as a face-sensitive potential and has been linked to structural encoding of faces; however, it remains debated whether the N170 is modulated by facial expressions of emotion. We investigated how attention to facial features affects the early stages of emotion perception during an implicit emotion processing task. ERPs were recorded in response to presentations of fearful, joyful, or neutral facial expressions while fixation was restricted, using an eye tracker, to the left eye, right eye, nose, or mouth. Participants' task was to discriminate the gender of the face. Enhanced N170 amplitudes and longer latencies were found when participants fixated on the left or right eye compared to the mouth or nose, irrespective of emotion. Importantly, the N170 was not modulated by emotion. These results support the view that the N170 component is not sensitive to facial expression in an implicit emotional task. In contrast, the fixated feature does modulate this component. As the eyes have been shown to be the diagnostic feature used to correctly categorize face gender, it may be that attention to the diagnostic feature drove the N170 modulation by emotion reported in previous studies that did not control for fixation. This idea is currently being tested using an explicit emotional task with the same stimuli.

Meeting abstract presented at VSS 2013.