Abstract

Knowledge about environmental objects derives from representations of multiple object features both within and across sensory modalities. While our understanding of the neural basis for visual object representation in the human and nonhuman primate brain is well advanced, a similar understanding of auditory objects is in its infancy. We used a name verification task and functional magnetic resonance imaging (fMRI) to characterize the neural circuits that are activated as human subjects match visually presented words with either simultaneously presented pictures or environmental sounds. The difficulty of the matching judgment was manipulated by varying the level of semantic detail at which the words and objects were compared. We found that blood oxygen level-dependent (BOLD) signal was modulated in ventral and dorsal regions of the inferior frontal gyrus of both hemispheres during auditory and visual object categorization, potentially implicating these areas as sites for integrating polymodal object representations with concepts in semantic memory. As expected, BOLD signal increases in the fusiform gyrus varied with the semantic level of object categorization, though this effect was weak and restricted to the left hemisphere in the case of auditory objects.
