Abstract

In music, the perception of pitch is governed largely by its tonal function given the preceding harmonic structure of the music. While behavioral research has advanced our understanding of the perceptual representation of musical pitch, relatively little is known about its representational structure in the brain. Using magnetoencephalography (MEG), we recorded evoked neural responses to different tones presented within a tonal context. Multivariate pattern analysis (MVPA) was applied to “decode” the stimulus that listeners heard based on the underlying neural activity. We then characterized the structure of the brain’s representation using decoding accuracy as a proxy for representational distance, and compared this structure to several well-established perceptual and acoustic models. The observed neural representation was best accounted for by a model based on the Standard Tonal Hierarchy, whereby differences in the neural encoding of musical pitches correspond to their differences in perceived stability. By confirming that perceptual differences honor those in the underlying neuronal population coding, our results provide a crucial link in understanding the cognitive foundations of musical pitch across psychological and neural domains.
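
As a rough illustration of the decoding step described above, the sketch below shows one way a cross-validated classifier could be trained to discriminate the MEG responses evoked by a pair of tones. The data layout (trials flattened into sensor-by-time feature vectors), the linear SVM, and the 5-fold cross-validation are illustrative assumptions, not the authors' exact pipeline.

```python
# Hypothetical pairwise decoding of MEG responses to two tones.
# Assumes X: (n_trials, n_features) array of flattened sensor-by-time patterns,
# and y: (n_trials,) integer labels giving the evoking pitch-class.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def pairwise_decoding_accuracy(X, y, tone_a, tone_b, n_folds=5):
    """Cross-validated accuracy for discriminating the responses to two tones."""
    mask = np.isin(y, [tone_a, tone_b])        # keep only trials for this pair
    X_pair, y_pair = X[mask], y[mask]
    clf = make_pipeline(StandardScaler(), LinearSVC())
    scores = cross_val_score(clf, X_pair, y_pair, cv=n_folds, scoring="accuracy")
    return scores.mean()
```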

Highlights

  • Two physically identical tones heard in different contexts may bear little resemblance to one another

  • By examining distinctions in the brain’s response to pitches of differing tonal function, the current study provided a neural analogue to prior psychological models and evaluated their specific predictions: how distinctly does the brain represent each pitch-class relative to the others, and how well do these neural distinctions align with perceptual differences between musical pitches?

  • The accuracy with which the classifier could discriminate between the neural responses to a given pair of tones provided a measure of their neural representational distance, and the complete set of pairwise distances defined the geometry of the stimuli’s representational structure in the brain[22,23] (see the sketch following this list)

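Continuing the illustration, the pairwise accuracies could be assembled into a symmetric representational dissimilarity matrix (RDM), with higher decoding accuracy standing in for greater representational distance. The helper `pairwise_decoding_accuracy` is the hypothetical function from the sketch above, and the set of tone labels is assumed to be supplied by the experiment.

```python
# Assemble pairwise decoding accuracies into a neural RDM (illustrative).
# Higher accuracy = more discriminable responses = larger representational distance.
from itertools import combinations
import numpy as np

def build_neural_rdm(X, y, tones):
    """Symmetric matrix of cross-validated pairwise decoding accuracies."""
    n = len(tones)
    rdm = np.zeros((n, n))
    for (i, tone_a), (j, tone_b) in combinations(enumerate(tones), 2):
        acc = pairwise_decoding_accuracy(X, y, tone_a, tone_b)  # from the sketch above
        rdm[i, j] = rdm[j, i] = acc
    return rdm
```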


Introduction

Two physically identical tones heard in different contexts may bear little resemblance to one another. This distinction arises because the tonality or key of the musical context assigns a unique function to each pitch[1]. Recording EEG from trained musicians, Krohn et al.[5] found that the amplitude of pitch-evoked response components was modulated by the perceptual stability of the evoking pitch-class, suggesting a stored representation of hierarchical pitch structure in cortex. While these studies flag the presence of tonal-schematic processing, relatively little is known about the explicit representational structure of musical pitch in the cortex. In the present study, the observed neural representation was best accounted for by a model derived from the Standard Tonal Hierarchy[20], indicating that differences in the neural encoding of musical pitch correspond to differences in their perceived stability.
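
For the model comparison itself, a tonal-hierarchy model RDM is often operationalized from probe-tone stability ratings. The sketch below assumes the (approximate) Krumhansl-Kessler C-major profile and a Spearman rank correlation between the upper triangles of the model and neural RDMs; the specific profile and comparison statistic used in the paper may differ.

```python
# Hypothetical model comparison: tonal-hierarchy RDM vs. neural RDM.
import numpy as np
from scipy.stats import spearmanr

# Approximate Krumhansl-Kessler probe-tone ratings for C major (C, C#, ..., B);
# higher values correspond to greater perceived stability.
kk_profile = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                       2.52, 5.19, 2.39, 3.66, 2.29, 2.88])

# Model distance between two pitch-classes: absolute difference in stability.
tonal_hierarchy_rdm = np.abs(kk_profile[:, None] - kk_profile[None, :])

def compare_rdms(neural_rdm, model_rdm):
    """Spearman correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices_from(neural_rdm, k=1)
    rho, p_value = spearmanr(neural_rdm[iu], model_rdm[iu])
    return rho, p_value
```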
