Abstract

Understanding how the brain incorporates sensory and motor information will enable better theory building on human perception and behavior. In this study, we aimed to estimate the influence of predictive mechanisms on the magnitude and variability of sensory attenuation in two online samples. After the presentation of a visual cue stimulus, participants (Experiment 1: N = 224; Experiment 2: N = 84) compared the loudness of two consecutive tones in a two-alternative forced-choice task. In Experiment 1, the first tone was either self-initiated or not; in Experiment 2, the second tone was either self-initiated or not (active and passive conditions, respectively). We further manipulated identity prediction (i.e., the congruence of pre-learned cue-sound combinations; congruent vs. incongruent) and the duration of the onset delay (50 ms vs. 0 ms; included to account for attentional differences between the active and passive conditions). We critically discuss our results within the framework of both classical (i.e., motor-based forward models) and contemporary approaches (i.e., the predictive processing framework). Contrary to our preregistered hypothesis, we observed enhanced perceptual processing, rather than attenuation, of self-initiated auditory input. Furthermore, our results reveal an effect of fixed sound delays on the processing of motor-based and non-motor-based predictive information and may point to corresponding shifts in attention, leading to a perceptual bias. These results might best be captured by a hybrid explanatory model that combines predictions based on self-initiated motor action with a global predictive mechanism.
