Abstract

Emotions can be conveyed through a variety of channels in the auditory domain, such as the human voice or music. Recent studies suggest that expertise in one sound category can affect the processing of emotional sounds in other sound categories. We focused here on how the neural processing of emotional information varies as a function of sound category and listener expertise. The electroencephalogram (EEG) of 20 non-musicians and 17 musicians was recorded while they listened to speech prosody, vocalizations (such as screams and laughter), and musical sounds. The amplitude of EEG-oscillatory activity in the theta, alpha, beta, and gamma bands was quantified, and Independent Component Analysis (ICA) was used to identify underlying components of brain activity in each band. Sound category-dependent activations were found in frontal theta and alpha, as well as greater activation for musicians than for non-musicians. Differences in the beta band were mainly due to differential processing of speech. The results reflect musicians' expertise in recognizing emotion-conveying music, which seems to generalize to emotional expressions conveyed by the human voice, in line with previous accounts of expertise effects on the processing of musical and vocal sounds.
