Abstract

Music is a powerful medium for expressing emotion, and music emotion recognition is the task of detecting the emotion conveyed by a musical fragment. This study proposes an approach to recognizing emotion by musical instrument class using deep learning techniques. The instruments in the music dataset fall into four classes: string, percussion, woodwind, and brass. The four emotions expressed by these instruments are neutral, sad, happy, and fear. From the instrumental music dataset, the following features are extracted: chroma energy normalized statistics (CENS), the temporal feature zero-crossing rate (ZCR), spectral rolloff, Mel-frequency cepstral coefficients (MFCC), spectral bandwidth, chroma short-time Fourier transform (STFT), and spectral centroid. Recurrent neural networks (RNNs) are trained on the extracted features to distinguish emotions, and their performance is compared against several machine learning classifiers used as a baseline. The results show that combining MFCC features with a deep RNN achieves significant performance on instrument emotion recognition, and that the instrument class has a significant impact on the emotion evoked by music.
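Two of the simpler features mentioned above, ZCR and spectral centroid, can be sketched directly in NumPy. This is an illustrative sketch only; the sample rate, frame handling, and signal are assumptions, not details from the study (which typically computes these per frame with a toolkit such as librosa rather than over a whole signal):

```python
import numpy as np

SR = 22050  # assumed sample rate; not specified in the abstract


def zero_crossing_rate(y: np.ndarray) -> float:
    """Fraction of adjacent sample pairs whose signs differ."""
    return float(np.mean(np.signbit(y[:-1]) != np.signbit(y[1:])))


def spectral_centroid(y: np.ndarray, sr: int = SR) -> float:
    """Magnitude-weighted mean frequency of the spectrum, in Hz."""
    mag = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)
    return float(np.sum(freqs * mag) / np.sum(mag))


# Example on a synthetic 440 Hz sine tone (one second):
t = np.arange(SR) / SR
y = np.sin(2 * np.pi * 440.0 * t)
zcr = zero_crossing_rate(y)        # ~ 2 * 440 / SR (two crossings per cycle)
centroid = spectral_centroid(y)    # ~ 440 Hz for a pure tone
```

For a pure tone, the centroid sits at the tone's frequency, and the ZCR approaches twice the frequency divided by the sample rate, which is why both are cheap proxies for pitch and brightness.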
