Abstract

Re-assessing the pre-attentive nature of integrating emotional faces and voices: an event-related potential (ERP) study

Tam Ho 1*, Sonja Kotz 2,3 and Jeesun Kim 4

1 Max Planck Institute for Human Cognitive and Brain Sciences, Neuropsychology, Germany
2 Max Planck Institute for Human Cognitive and Brain Sciences, Germany
3 University of Manchester, School of Psychology, UK
4 MARCS Institute, Australia

Research on emotional face-voice integration has been dominated by the hypothesis that facial and vocal emotional information interacts pre-attentively. We investigated this hypothesis using event-related potentials (ERPs). Twenty-nine participants (15 female, 20-35 years old) were presented with congruent and incongruent combinations of angry and neutral facial and vocal expressions in an oddball paradigm. In two of four blocks (one with an angry voice, one with a neutral voice), participants encountered congruent and incongruent combinations as standards (~80%) and deviants (~10%), respectively; standards and deviants were switched in the two remaining blocks. Participants were tested in two consecutive sessions. In Session 1, they watched the videos passively. In Session 2, they were instructed to detect and respond when lip movement and voice onset were out of sync (which occurred in ~10% of trials). Auditory evoked potentials elicited by deviants were inspected for a mismatch negativity (MMN), an ERP component associated with pre-attentive deviance detection. Our results indicate that no MMN was elicited in either the passive or the active condition. Instead, we found effects of emotional face-voice incongruity on the auditory N1 and P2, suggesting that facial and vocal emotional information interacted early. The interaction was most robust in the passive condition, when deviant stimuli captured involuntary attention, but weakened in the active condition, possibly due to task demands. Our findings therefore refute the above hypothesis and show that even early interactions of facial and vocal emotional information require some degree of attention.

Keywords: Attention, Face, Voice, emotion, mismatch negativity (MMN), multisensory integration, event-related potential (ERP)

Conference: XII International Conference on Cognitive Neuroscience (ICON-XII), Brisbane, Queensland, Australia, 27 Jul - 31 Jul, 2014.

Presentation Type: Oral Presentation

Topic: Emotional and Social Processes

Citation: Ho T, Kotz S and Kim J (2015). Re-assessing the pre-attentive nature of integrating emotional faces and voices: an event-related potential (ERP) study. Conference Abstract: XII International Conference on Cognitive Neuroscience (ICON-XII). doi: 10.3389/conf.fnhum.2015.217.00031

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated. Each abstract, as well as the collection of abstracts, is published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed. For Frontiers' terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received: 19 Feb 2015; Published Online: 24 Apr 2015.

* Correspondence: Ms. Tam Ho, Max Planck Institute for Human Cognitive and Brain Sciences, Neuropsychology, Leipzig, Germany, htho@cbs.mpg.de
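Note: The abstract does not name the analysis software or pipeline used. Purely as an illustration of the kind of analysis described, the sketch below shows how a deviant-minus-standard difference wave, the quantity typically inspected for an MMN, might be computed with MNE-Python. The file name, event codes, filter settings, and channel names are assumptions for the example, not details taken from the study.

    # Illustrative sketch only (not the authors' pipeline): compute a
    # deviant-minus-standard difference wave to inspect for an MMN.
    import mne

    # Hypothetical recording file and standard ERP band-pass filter
    raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
    raw.filter(l_freq=0.1, h_freq=30.0)

    # Hypothetical event codes: 1 = standard (e.g. congruent face-voice),
    # 2 = deviant (e.g. incongruent face-voice)
    events = mne.find_events(raw)
    event_id = {"standard": 1, "deviant": 2}

    # Epoch around stimulus onset with a pre-stimulus baseline
    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=-0.1, tmax=0.5, baseline=(None, 0), preload=True)

    standard = epochs["standard"].average()
    deviant = epochs["deviant"].average()

    # Difference wave: a fronto-central negativity roughly 100-250 ms after
    # deviance onset would be consistent with an MMN; the abstract reports
    # that no such negativity was found in either session.
    difference = mne.combine_evoked([deviant, standard], weights=[1, -1])
    difference.plot(picks=["Fz", "FCz", "Cz"])  # hypothetical channel names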
