The human brain tracks regularities in the environment and extrapolates from them to predict future events. Prior work on music cognition suggests that low-frequency (1-8 Hz) brain activity encodes melodic predictions beyond the stimulus acoustics. Building on this work, we aimed to disentangle the frequency-specific neural dynamics linked to melodic prediction uncertainty (modelled as entropy) and prediction error (modelled as surprisal) for temporal (note onset) and content (note pitch) information. Using multivariate temporal response function (TRF) models, we re-analysed the electroencephalogram (EEG) of 20 subjects (10 musicians) who listened to Western tonal music. Our results show that melodic expectation metrics improve EEG reconstruction accuracy in all frequency bands below the gamma range (< 30 Hz). Crucially, entropy contributed more strongly than surprisal to this enhancement in reconstruction accuracy across all frequency bands. Additionally, we found that the encoding of temporal, but not content, information metrics was not limited to low frequencies but extended to higher frequencies (> 8 Hz). An analysis of the TRF weights revealed that the temporal predictability of a note (entropy of note onset) may be encoded in delta- (1-4 Hz) and beta-band (12-30 Hz) brain activity prior to the stimulus, suggesting that these frequency bands are associated with temporal prediction. Strikingly, we also found that melodic expectations selectively enhanced EEG reconstruction accuracy in the beta band for musicians, and in the alpha band (8-12 Hz) for non-musicians, suggesting that musical expertise influences the neural dynamics underlying predictive processing in music cognition.
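To illustrate the kind of analysis summarised above, the sketch below shows a minimal forward multivariate TRF fit with ridge regression on synthetic data, comparing EEG prediction accuracy for an acoustics-only feature set against acoustics plus melodic-expectation regressors (entropy and surprisal). This is not the authors' pipeline: the sampling rate, lag window, regularisation parameter, and all feature values are illustrative assumptions.

```python
# Minimal TRF sketch on synthetic data (assumed parameters, not the study's values).
import numpy as np

rng = np.random.default_rng(0)
fs = 64                               # assumed EEG sampling rate (Hz)
n_samples = fs * 120                  # two minutes of synthetic data
lags = np.arange(0, int(0.35 * fs))   # assumed 0-350 ms TRF lag window
lam = 1e2                             # assumed ridge regularisation strength

# Synthetic stimulus features: acoustic envelope plus entropy/surprisal regressors.
envelope = np.abs(rng.standard_normal(n_samples))
entropy = rng.random(n_samples)
surprisal = rng.random(n_samples)
features_acoustic = envelope[:, None]
features_full = np.column_stack([envelope, entropy, surprisal])

# Synthetic EEG that depends on all three features, so the full model should win.
true_kernel = rng.standard_normal(len(lags))
eeg = sum(np.convolve(features_full[:, j], true_kernel, mode="full")[:n_samples]
          for j in range(features_full.shape[1]))
eeg += 0.5 * rng.standard_normal(n_samples)

def lagged_design(X, lags):
    """Stack time-lagged copies of each feature column (the TRF design matrix)."""
    cols = [np.roll(X[:, j], lag) for j in range(X.shape[1]) for lag in lags]
    D = np.column_stack(cols)
    D[: lags.max()] = 0.0             # zero out samples wrapped around by np.roll
    return D

def trf_predict_corr(X, y, lags, lam, split=0.8):
    """Fit a ridge TRF on the first part of the data; return prediction r on the rest."""
    D = lagged_design(X, lags)
    n_train = int(split * len(y))
    Dtr, Dte, ytr, yte = D[:n_train], D[n_train:], y[:n_train], y[n_train:]
    w = np.linalg.solve(Dtr.T @ Dtr + lam * np.eye(D.shape[1]), Dtr.T @ ytr)
    return np.corrcoef(Dte @ w, yte)[0, 1]

r_acoustic = trf_predict_corr(features_acoustic, eeg, lags, lam)
r_full = trf_predict_corr(features_full, eeg, lags, lam)
print(f"acoustics only:          r = {r_acoustic:.3f}")
print(f"+ expectation features:  r = {r_full:.3f}  (gain = {r_full - r_acoustic:.3f})")
```

In the study's framing, the gain in prediction correlation when expectation regressors are added is the quantity compared across frequency bands and between musicians and non-musicians; the synthetic example above only demonstrates the mechanics of that comparison.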