Abstract

This paper explores a novel direction in music-induced emotion (music emotion) analysis: the effect of genre on the prediction of music emotion. We compare the performance of various classifiers in predicting the emotion induced by music and investigate whether advanced features (such as asymmetries) improve classification accuracy. The study is supported by real-world experiments in which 10 subjects listened to 20 musical pieces from five genres (classical, heavy metal, electronic dance music, pop, and rap) while electroencephalogram (EEG) data were collected. Maximum 10-fold cross-validation accuracies of 98.4% (subject-independent) and 99.0% (subject-dependent) were obtained for the classification of short instances of each song. Emotion induced by pop music was predicted most accurately, with a classification accuracy of 99.6%. We further examined the effect of music emotion on subjects' relaxation while listening.
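The abstract does not specify the implementation, but the pipeline it describes (asymmetry features extracted from short EEG instances, fed to a classifier evaluated with 10-fold cross-validation) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the synthetic data, the 128 Hz sampling rate, the alpha band, the left/right channel pairing, and the choice of a random-forest classifier are all hypothetical stand-ins, not the authors' method.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for segmented EEG: (n_instances, n_channels, n_samples).
# The 128 Hz rate, 2-second instances, and 4 channels are assumptions.
FS = 128
X_raw = rng.standard_normal((200, 4, 2 * FS))
y = rng.integers(0, 2, size=200)  # hypothetical binary emotion label

def band_power(sig, fs, lo=8.0, hi=13.0):
    """Mean alpha-band power of one channel via Welch's method."""
    f, pxx = welch(sig, fs=fs, nperseg=fs)
    return pxx[(f >= lo) & (f <= hi)].mean()

def asymmetry_features(x, fs, pairs=((0, 1), (2, 3))):
    """Log-power differences between assumed left/right channel pairs."""
    return [
        np.log(band_power(x[right], fs)) - np.log(band_power(x[left], fs))
        for left, right in pairs
    ]

X = np.array([asymmetry_features(x, FS) for x in X_raw])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)  # 10-fold CV, as in the abstract
print(f"mean accuracy: {scores.mean():.3f}")
```

On random synthetic data this hovers near chance; the point is only the shape of the pipeline, with subject-dependent versus subject-independent evaluation differing in how instances are grouped across folds.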
