Abstract

Music is a powerful medium for expressing emotion, and music emotion recognition is the task of detecting the emotion conveyed by a musical fragment. This study proposes a novel approach to recognizing emotion by musical instrument class using deep learning techniques. The instruments in the music dataset fall into four classes: string, percussion, woodwind, and brass. Four emotions expressed by these instruments are considered: neutral, sad, happy, and fear. From the instrumental music dataset, the following features are extracted: chroma energy normalized statistics (CENS), the temporal feature zero-crossing rate (ZCR), Mel-frequency cepstral coefficients (MFCC), chroma short-time Fourier transform (STFT), and spectral features such as rolloff, bandwidth, and spectral centroid. Recurrent neural networks (RNNs) are trained on the extracted features to distinguish emotions, and their performance is compared against several machine learning classifiers used as a baseline. The results show that combining MFCC features with a deep RNN achieves significant instrument emotion recognition performance, and that the instrument class has a significant impact on the emotion evoked by music.
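To make two of the extracted features concrete, the sketch below computes the zero-crossing rate (ZCR) and the spectral centroid for a synthetic tone using plain NumPy. This is an illustrative sketch of the textbook definitions, not the paper's actual extraction pipeline (which would typically operate frame-by-frame on real recordings); the function names and the test signal are assumptions for the example.

```python
import numpy as np

def zero_crossing_rate(signal):
    # Fraction of adjacent-sample pairs whose sign differs.
    signs = np.signbit(signal)
    return float(np.mean(signs[1:] != signs[:-1]))

def spectral_centroid(signal, sr):
    # Magnitude-weighted mean frequency of the spectrum (Hz).
    mag = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return float(np.sum(freqs * mag) / np.sum(mag))

sr = 22050                      # sample rate (Hz)
t = np.arange(sr) / sr          # one second of samples
tone = np.sin(2 * np.pi * 440.0 * t)  # pure 440 Hz tone (A4)

zcr = zero_crossing_rate(tone)
centroid = spectral_centroid(tone, sr)
print(f"ZCR: {zcr:.4f}, centroid: {centroid:.1f} Hz")
```

For a pure 440 Hz sine the centroid sits at the tone's frequency, and the ZCR is roughly twice the frequency divided by the sample rate, since each cycle crosses zero twice. Brighter, noisier timbres (e.g. percussion) push both values up, which is why such features carry instrument- and emotion-relevant information.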
