Abstract
Visual symbols or events may provide predictive information about to-be-expected sound events. When the perceived sound does not confirm the visual prediction, the incongruency response (IR), a prediction error signal in the event-related brain potentials, is elicited. It is unclear whether predictions are derived from lower-level local contingencies (e.g., recent events or repetitions) or from higher-level global rules applied top-down. In a recent study, sound pitch was predicted by a preceding note symbol. IR elicitation was confined to the condition in which one of two sounds was presented more frequently; it was absent when both sounds were equally probable. These findings suggest that local repetitions support predictive cross-modal processing. On the other hand, the IR has also been observed with equal stimulus probabilities when visual patterns predicted the upcoming sound sequence, suggesting the application of global rules. Here, we investigated the influence of stimulus repetition on elicitation of the IR by presenting identical trial trains in which a particular visual note symbol cued a particular sound, resulting in either a congruent or an incongruent pair. Trains of four different lengths (1, 2, 4, or 7 trials) were presented. The IR was observed already after a single presentation of a congruent visual-cue-sound combination and did not change in amplitude as trial train length increased. We conclude that higher-level associations applied in a top-down manner are involved in eliciting the prediction error signal reflected by the IR, independently of local contingencies.
Highlights
Visual information may influence the processing of auditory information, as illustrated by phenomena like the ventriloquist illusion (e.g., Alais & Burr, 2004), the McGurk effect (McGurk & MacDonald, 1976), or cross-modal spatial attention effects (e.g., Eimer & Driver, 2001).
The sighting of an approaching dog might induce the expectation of a barking sound, and after seeing a flash in the sky the sound of thunder would be expected. The existence of such visually induced auditory predictions is often probed in experiments by the presentation of sounds incongruent to the prediction, which results in the elicitation of prediction error signals in the event-related brain potential (ERP).
Based on the existing literature, we assumed that visual information preceding the upcoming auditory stimulus would lead to visually based sensory predictions.
Summary
Visual information may influence the processing of auditory information, as illustrated by phenomena like the ventriloquist illusion (e.g., Alais & Burr, 2004), the McGurk effect (McGurk & MacDonald, 1976), or cross-modal spatial attention effects (e.g., Eimer & Driver, 2001). Widmann et al. (2004) observed an IR with an equal distribution of high and low sounds, suggesting that visual–auditory associations are the result of learned higher-level associations that are predicted as soon as the visual stimulus is presented (i.e., without the need for local repetition). Given these contradicting findings (Stuckenberg et al., 2019 vs. Widmann et al., 2004) and the literature on short-latency components (Wacongne et al., 2011), the present study aims to better understand the role of local repetition in the elicitation of the IR. The N2-P3 response is expected for all conditions in the current experiment.