Abstract

Social communication relies on the integration of auditory and visual information conveyed by faces and vocalizations. Evidence suggests that integrating information from multiple sources enhances perception compared with processing a unimodal stimulus. Our previous studies demonstrated that single neurons in the ventrolateral prefrontal cortex (VLPFC) of the rhesus monkey (Macaca mulatta) respond to and integrate conspecific vocalizations and their accompanying facial gestures. We were therefore interested in how VLPFC neurons respond differentially to matching (congruent) and mismatching (incongruent) faces and vocalizations. We recorded VLPFC neurons during the presentation of movies with congruent or incongruent species-specific facial gestures and vocalizations, as well as their unimodal components. Recordings showed that while many VLPFC units were multisensory, responding to faces, vocalizations, or their combination, a subset of neurons showed a significant change in activity in response to incongruent versus congruent vocalization movies. Among these neurons, we typically observed incongruent suppression during the early stimulus period and incongruent enhancement during the late stimulus period. Incongruent-responsive VLPFC neurons were both bimodal and nonlinearly multisensory, consistent with their ability to respond to changes in either modality of a face-vocalization stimulus. These results demonstrate that ventral prefrontal neurons respond to changes in either modality of an audiovisual stimulus, which is important in identity processing and in the integration of multisensory communication information.
