Abstract

Although certain parallels can be drawn between written language and musical notation (both use arbitrary visual symbols to notate the salient aspects of a sound pattern), the purpose of each notational system differs markedly. While the primary function of written language is to carry referential meaning, the primary function of musical notation is to carry instructions for the production of a musical performance. Music reading thus lies at the interface between perception and action and provides an ecological model with which to study how visual instructions influence the motor system. The studies presented in this article investigate, from both a cognitive and a neurological perspective, how musical symbols on the page are decoded into a musical response. The results of a musical Stroop paradigm are described, in which musical notation was present but irrelevant to task performance. The presence of musical notation produced systematic effects on reaction time, demonstrating that reading of the written note, as well as the written word, is obligatory for those who are musically literate. Spatial interference tasks are also described which suggest that music reading, at least for the pianist, can be characterized as a set of vertical-to-horizontal mappings. These behavioural findings are mirrored by the results of an fMRI training study in which musically untrained adults were taught to read music and play the piano keyboard over a period of three months. Learning-specific changes were seen in the superior parietal cortex and the supramarginal gyrus, areas known to be involved in spatial sensorimotor transformations and the preparation of learned actions, respectively.
