Abstract

Results of several experiments comparing language and music processing with the Event-Related brain Potentials (ERPs) method are reviewed. By recording changes in brain electrical activity time-locked to the event of interest, it is possible to study precisely the time-course of the processes involved in language comprehension and music perception. Across experiments, different levels of linguistic and musical processing are compared: the semantic, syntactic and temporal levels in language, and the melodic, harmonic and rhythmic levels in music. Overall, the results point to strong differences between language and music when semantic processing in language is compared with both melodic and harmonic processing in music. In contrast, strong similarities emerge from comparisons between the syntactic and harmonic aspects, as well as from the role played by temporal information in both language and music.
