Abstract
Background
Numerous previous experiments have used the oddball paradigm to study change detection. This paradigm is applied here to study change detection of facial expressions in a context that demands abstraction of the emotional expression-related facial features among other changing facial features.
Methods
Event-related potentials (ERPs) were recorded in adult humans engaged in a demanding auditory task. In an oddball paradigm, repeated pictures of faces with a neutral expression ('standard', p = .9) were rarely replaced by pictures with a fearful ('fearful deviant', p = .05) or happy ('happy deviant', p = .05) expression. Importantly, facial identity changed from picture to picture. Thus, change detection required abstraction of facial expression from changes in several low-level visual features.
Results
ERPs to both types of deviants differed from those to standards. At occipital electrode sites, ERPs to deviants were more negative than ERPs to standards at 150–180 ms and 280–320 ms post-stimulus. A positive shift to deviants at fronto-central electrode sites in the analysis window of 130–170 ms post-stimulus was also found. Waveform analysis, computed as point-wise comparisons between the amplitudes elicited by standards and deviants, revealed that the occipital negativity emerged earlier to happy deviants than to fearful deviants (after 140 ms versus 160 ms post-stimulus, respectively). In turn, the anterior positivity emerged earlier to fearful deviants than to happy deviants (110 ms versus 120 ms post-stimulus, respectively).
Conclusion
ERP amplitude differences between emotional and neutral expressions indicated pre-attentive change detection of facial expressions among neutral faces. The posterior negative difference at 150–180 ms latency resembled the visual mismatch negativity (vMMN), an index of pre-attentive change detection previously studied only for changes in low-level visual features. The positive anterior difference in ERPs at 130–170 ms post-stimulus probably indexed pre-attentive attention orienting towards emotionally significant changes. The results show that the human brain can abstract emotion-related features of faces while engaged in a demanding task in another sensory modality.
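The stimulus design above can be sketched in code. This is a minimal illustration, not the study's actual presentation software; the function name, the number of facial identities, and the seed are assumptions made for the example.

```python
import random

def make_oddball_sequence(n_trials, p_fearful=0.05, p_happy=0.05,
                          n_identities=8, seed=0):
    """Sketch of the oddball design: mostly neutral 'standard' faces
    (p = .9), rarely replaced by fearful or happy 'deviants'
    (p = .05 each). Facial identity changes on every trial, so
    expression must be abstracted from low-level feature changes.
    n_identities and seed are illustrative assumptions."""
    rng = random.Random(seed)
    trials = []
    prev_identity = None
    for _ in range(n_trials):
        r = rng.random()
        if r < p_fearful:
            expression = "fearful"
        elif r < p_fearful + p_happy:
            expression = "happy"
        else:
            expression = "neutral"
        # Force an identity change relative to the previous trial.
        identity = rng.choice(
            [i for i in range(n_identities) if i != prev_identity])
        trials.append((expression, identity))
        prev_identity = identity
    return trials

trials = make_oddball_sequence(1000)
frac_neutral = sum(1 for e, _ in trials if e == "neutral") / len(trials)
print(frac_neutral)  # typically close to 0.9
```

Because identity changes on every trial, a deviant cannot be detected from any single low-level image difference; only the expression category distinguishes it from the standards.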
Highlights
Numerous previous experiments have used the oddball paradigm to study change detection.
The positive anterior difference in event-related potentials (ERPs) at 130–170 ms post-stimulus probably indexed pre-attentive attention orienting towards emotionally significant changes.
ERP latencies: to analyze possible differences in the time course of the visual mismatch negativity (vMMN) for the fearful and happy deviants, the ERPs to them were compared with the ERPs to the respective preceding standards by point-wise paired t-tests (Table 1). These analyses revealed that, at the occipital sites, ERPs to happy deviants differed from ERPs to standards earlier than ERPs to fearful deviants did.
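The point-wise paired t-test analysis mentioned above can be sketched as follows. This is a minimal pure-Python illustration, assuming per-subject ERP waveforms stored as subjects-by-timepoints lists; the function name and data layout are assumptions, not details from the study.

```python
import math
from statistics import mean, stdev

def pointwise_paired_t(deviant_erps, standard_erps):
    """Point-wise paired t-tests across time samples.

    deviant_erps, standard_erps: lists of per-subject waveforms
    (one inner list of amplitudes per subject, same time base).
    Returns one t value per time point, testing the
    deviant-minus-standard amplitude difference against zero."""
    n_subjects = len(deviant_erps)
    n_points = len(deviant_erps[0])
    t_values = []
    for t in range(n_points):
        # Paired differences at this time point, one per subject.
        diffs = [deviant_erps[s][t] - standard_erps[s][t]
                 for s in range(n_subjects)]
        # Paired t statistic: mean difference over its standard error.
        se = stdev(diffs) / math.sqrt(n_subjects)
        t_values.append(mean(diffs) / se)
    return t_values
```

Running such a test at every sample yields a t-value time course; the latency at which it first crosses the significance threshold (and stays there for a run of consecutive samples) is how onset differences like 140 ms versus 160 ms can be compared between deviant types.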
Summary
Numerous previous experiments have used the oddball paradigm to study change detection. This paradigm is applied here to study change detection of facial expressions in a context that demands abstraction of the emotional expression-related facial features among other changing facial features. In a multitude of psychophysiological studies, change detection has been explored by recording event-related potentials (ERPs) to serially presented stimuli in a so-called oddball paradigm. In this paradigm, frequently presented (standard) stimuli are randomly replaced by infrequent (deviant) ones that differ from them in one or more aspects. In audition, these changes elicit the mismatch negativity (MMN) component at 100–200 ms from stimulus onset, even if the subjects are not attending to the stimulation but are concentrating on another task. The P3 is modality non-specific and has been observed in response to infrequent (unattended) deviant stimuli (P3a, e.g. [3]) and to target stimuli (P3b, e.g. [4]).