Abstract

Even without formal training, humans experience a wide range of emotions in response to changes in musical features, such as tonality and rhythm, during music listening. While many studies have investigated how isolated elements of tonal and rhythmic properties are processed in the human brain, it remains unclear whether findings obtained with such controlled stimuli generalize to complex stimuli in the real world. In the current study, we present an analytical framework for linearized encoding analysis based on a set of music information retrieval features to investigate the rapid cortical encoding of tonal and rhythmic hierarchies in natural music. We applied this framework to a public domain EEG dataset (OpenMIIR) to deconvolve overlapping EEG responses to various musical features in continuous music. In particular, the proposed framework investigated the EEG encoding of the following features: tonal stability, key clarity, beat, and meter. This analysis revealed differential spatiotemporal neural encoding of beat and meter, but not of tonal stability or key clarity. The results demonstrate that this framework can uncover associations between ongoing brain activity and relevant musical features, and it could be extended to other relevant measures, such as time-resolved emotional responses, in future studies.
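To make the framework concrete, the sketch below shows a minimal linearized (TRF-style) encoding model in Python: multichannel EEG is predicted from time-lagged copies of a single stimulus feature via ridge regression. The feature, lag range, ridge parameter, and surrogate data are illustrative assumptions for demonstration, not the study's actual preprocessing or model settings.

    # Minimal sketch of a linearized (TRF-style) encoding model: each EEG
    # channel is predicted from time-lagged copies of a stimulus feature
    # (e.g., a beat or tonal-stability time series) via ridge regression.
    import numpy as np

    def lagged_design_matrix(feature, lags):
        """Stack time-shifted copies of a 1-D feature into columns."""
        n = len(feature)
        X = np.zeros((n, len(lags)))
        for j, lag in enumerate(lags):
            if lag >= 0:
                X[lag:, j] = feature[:n - lag]
            else:
                X[:n + lag, j] = feature[-lag:]
        return X

    def fit_trf(feature, eeg, lags, ridge=1.0):
        """Ridge regression of EEG (time x channels) on lagged features."""
        X = lagged_design_matrix(feature, lags)
        XtX = X.T @ X + ridge * np.eye(X.shape[1])
        return np.linalg.solve(XtX, X.T @ eeg)  # (lags x channels) weights

    # Toy usage: 10 s of surrogate data at 64 Hz, lags from 0 to ~250 ms
    fs = 64
    rng = np.random.default_rng(0)
    feature = rng.standard_normal(10 * fs)      # e.g., a beat-strength series
    eeg = rng.standard_normal((10 * fs, 8))     # 8-channel surrogate EEG
    lags = np.arange(0, int(0.25 * fs) + 1)
    weights = fit_trf(feature, eeg, lags, ridge=10.0)
    print(weights.shape)                        # (17, 8): one TRF per channel

In this framing, a feature's contribution is assessed by comparing prediction accuracy between models that do and do not include it, as in the model comparisons of prediction accuracy reported in the Highlights.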

Highlights

  • Music is a universal auditory experience known to evoke intense feelings

  • We did not find a significant increase in prediction accuracy for either key clarity or tonal stability calculated on each beat or measure (Eq 6-1 vs. Eq 5-2, minimum cluster-p = 0.1211; Eq 6-2 vs. Eq 5-2, minimum cluster-p = 0.0762; Supplementary Figures 2–5)

  • The control analysis revealed that the multivariate temporal response function (mTRF) analysis sensitively detects envelope tracking compared with models using phase-randomized envelopes (Figure 3; see the sketch after this list)
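The phase-randomization control mentioned in the last highlight can be sketched as follows: the envelope's amplitude spectrum is kept intact while its Fourier phases are scrambled, destroying temporal alignment with the EEG so that any remaining tracking reflects chance. The toy envelope and function names below are illustrative assumptions rather than the dataset's actual signals.

    # Minimal sketch of a phase-randomized surrogate: same amplitude
    # spectrum as the original envelope, but random Fourier phases.
    import numpy as np

    def phase_randomize(signal, rng=None):
        """Return a surrogate with identical power spectrum but random phases."""
        rng = np.random.default_rng() if rng is None else rng
        spectrum = np.fft.rfft(signal)
        phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=spectrum.shape))
        phases[0] = 1.0                     # keep the DC bin real
        if len(signal) % 2 == 0:
            phases[-1] = 1.0                # keep the Nyquist bin real
        return np.fft.irfft(np.abs(spectrum) * phases, n=len(signal))

    # Toy check: power spectrum is preserved, temporal structure is not
    fs = 64
    t = np.arange(0, 10, 1 / fs)
    envelope = 0.5 + 0.5 * np.sin(2 * np.pi * 2.0 * t)   # 2 Hz toy envelope
    null_envelope = phase_randomize(envelope, np.random.default_rng(0))
    print(np.allclose(np.abs(np.fft.rfft(envelope)),
                      np.abs(np.fft.rfft(null_envelope))))   # True

Fitting the same encoding model on such surrogates yields a null distribution against which the observed envelope tracking can be judged.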



Introduction

Humans connect to music on an emotional level and generate expectations as they listen (Koelsch et al., 2000). We combine cues gathered in real time from what we are listening to with internalized musical patterns, or schemata, acquired in our respective cultural settings to predict what will happen next, and these predictions in turn shape our emotions. Schemata consist of musical features, such as tonality (i.e., pitches and their relationship to one another) and rhythm. Tonality has often been studied using heavily contrived chord progressions instead of more natural, original music in order to impose rigorous controls on the experiment (Fishman et al., 2001; Loui and Wessel, 2007; Koelsch and Jentschke, 2010). Beat perception studies have favored simplistic, isolated rhythms over the complex patterns found in natural music.
