Abstract

To test the hypothesis that semantic processes are represented in multiple subsystems, we recorded the electroencephalogram (EEG) while eliciting object memories using the modified Semantic Object Retrieval Test, during which an object feature, presented as a visual word [VW], an auditory word [AW], or a picture [Pic], was followed by a second feature always presented as a visual word. We performed both hypothesis-driven and data-driven analyses using event-related potentials (ERPs) time-locked to the second stimulus. We replicated a previously reported left fronto-temporal ERP effect (750–1000 ms post-stimulus) in the VW task, and also found that this ERP component was present during object memory retrieval only for verbal (VW, AW) as opposed to non-verbal (Pic) stimulus types. We also found a right temporal ERP effect (850–1000 ms post-stimulus) that was present for the auditory (AW) but not the visual (VW, Pic) stimulus types. In addition, we found an earlier left temporo-parietal ERP effect between 350 and 700 ms post-stimulus and a later midline parietal ERP effect between 700 and 1100 ms post-stimulus, both present in all stimulus types, suggesting common neural mechanisms for object retrieval processes and object activation, respectively. These findings support multiple semantic subsystems that respond to varying stimulus modalities, and argue against an ultimate unitary amodal semantic analysis.
