Abstract

The concept of two largely independent systems, with strict left-hemisphere lateralization of language and predominantly right lateralization of music, is being challenged by the alternative view that language and music are closely related cognitive and neural systems comprising complex constellations of sub-processes, some shared and others not. Neurophysiological data demonstrating similar processing of syntax and semantics, together with similarities in the development of the two domains in the infant brain, support the view that language and music have much in common and complement each other. Close interaction between the two hemispheres is needed for optimal functioning of both language and music; the right hemisphere, for example, plays an important role in understanding complex natural language such as stories and metaphors. Learning to read and write, as well as musical training, induces functional and anatomical changes in functionally relevant connections and modifies hemispheric asymmetries for specific functions. Comparative research on music and language provides a way to study basic brain mechanisms and how the brain transforms acoustic stimuli into the uniquely human abilities of language and music, and it may help bridge the divide between the sciences and the humanities.
