Abstract
In our everyday lives, we need to process auditory and visual temporal information as efficiently as possible. Although automatic auditory time perception has been widely investigated using the mismatch negativity (MMN) as an index, the neural basis of automatic visual time perception has been largely ignored. The present study investigated the automatic processing of auditory and visual time perception using a cross-modal delayed-response oddball paradigm. In the experimental condition, the standard stimulus lasted 200 ms and the deviant stimulus lasted 120 ms; these durations were exchanged in the control condition. Reaction time, accuracy, and event-related potential (ERP) data were recorded while participants performed the duration discrimination task. The ERP results showed that the MMN, N2b, and P3 were elicited by the auditory deviant stimulus under the attention condition, whereas only the MMN was elicited under the inattention condition. The MMN was largest over the frontal and central sites, and its amplitude did not differ significantly between the attention and inattention conditions. In contrast, the change-related positivity (CRP) and the visual mismatch negativity (vMMN) were elicited by the visual deviant stimulus under both the attention and inattention conditions. The CRP was largest over the occipito-temporal sites under the attention condition and over the fronto-central sites under the inattention condition, and its amplitude differed significantly between the attention and inattention conditions. The vMMN was largest over the parieto-occipital sites under the attention condition and over the fronto-central sites under the inattention condition, and its amplitude also differed significantly between the two conditions. Thus, the auditory MMN does not appear to be modulated by attention, whereas the visual CRP and vMMN are. The present study therefore provides electrophysiological evidence for the existence of automatic visual time perception and supports an “attentional switch” hypothesis for the modality effect on duration judgments, such that auditory temporal information is processed relatively automatically, whereas visual temporal information processing requires controlled attention.
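To illustrate the oddball design described above, the following minimal Python sketch generates a stimulus sequence using the durations reported in the abstract (200 ms standard, 120 ms deviant in the experimental condition). The deviant probability, trial count, and no-repeat constraint are illustrative assumptions, not values taken from the study.

```python
import random

STANDARD_MS = 200  # standard stimulus duration (experimental condition)
DEVIANT_MS = 120   # deviant stimulus duration (experimental condition)


def make_oddball_sequence(n_trials=400, p_deviant=0.2, seed=0):
    """Return a list of stimulus durations (ms).

    Deviants are never presented back-to-back, a common constraint in
    oddball designs (assumed here, not specified in the abstract).
    """
    rng = random.Random(seed)
    seq = []
    for _ in range(n_trials):
        is_deviant = rng.random() < p_deviant and (not seq or seq[-1] != DEVIANT_MS)
        seq.append(DEVIANT_MS if is_deviant else STANDARD_MS)
    return seq


if __name__ == "__main__":
    durations = make_oddball_sequence()
    print(f"{durations.count(DEVIANT_MS)} deviants out of {len(durations)} trials")
    # In the control condition the roles are swapped: 120 ms standard, 200 ms deviant.
```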