Abstract

Numerous studies have demonstrated that the structural and functional differences between professional musicians and non-musicians are not only found within a single modality, but also with regard to multisensory integration. In this study we combined psychophysical and neurophysiological measurements to investigate the processing of non-musical, synchronous or variously asynchronous audiovisual events. We hypothesize that long-term multisensory experience alters temporal audiovisual processing already at a non-musical stage. Behaviorally, musicians scored significantly better than non-musicians in judging whether the auditory and visual stimuli were synchronous or asynchronous. At the neural level, the statistical analysis of the audiovisual asynchronous response revealed three clusters of activation in both groups, including the ACC and the SFG, and two bilaterally located activations in the IFG and STG. Musicians, in comparison to non-musicians, responded to synchronous audiovisual events with enhanced neuronal activity in a broad left posterior temporal region covering the STG, the insula and the Postcentral Gyrus. Musicians also showed significantly greater activation in the left Cerebellum when confronted with an audiovisual asynchrony. Taken together, our MEG results strongly indicate that long-term musical training alters basic audiovisual temporal processing already at an early stage (directly after the auditory N1 wave), while the psychophysical results indicate that musical training may also provide behavioral benefits in the accuracy of estimates regarding the timing of audiovisual events.

Highlights

  • Multisensory events, such as watching and listening to an opera or a concert, are mostly perceptually integrated and recognized as having synchronous audiovisual information even when perceived from a distance

  • Apart from pitch and dynamics, precise timing is among the greatest challenges in orchestral music making

  • Activations on the left side were located in the left Superior Temporal Gyrus (STG) (peak coordinates: x = −44, y = −2, z = −26; cluster size = 1433 voxels; t(23) = 4.77; p < 0.05 AlphaSim corrected) and Inferior Frontal Gyrus (IFG)

Introduction

Multisensory events, such as watching and listening to an opera or a concert, are mostly perceptually integrated and recognized as having synchronous audiovisual information even when perceived from a distance. Sound travels much more slowly than light in air, so the visual and auditory information of a distant event arrive asynchronously. This tolerance in recognizing the timing differences of multisensory events helps us avoid focusing unnecessary attention on this phenomenon in daily perception. Orchestral musicians rely on even more advanced multimodal skills. They not only have to coordinate and integrate their motor actions with visual, auditory and proprioceptive feedback from their own instrument and from the musical score, but they also have to attend to and synchronize their actions with those of their fellow musicians (using visual and auditory information) and with the conductor's gestures (visual). In comparison to non-musicians, they have pronounced auditory cortical representations for tones of the musical scale [12,13,14,15,16], superior ability for musical imagery [17], enhanced cortical representation for musical timbre [18] and increased sensorimotor responses [19,20].
