Abstract

Most music conveys emotional signals, and classifying music by emotion makes it easier to organize and retrieve. Music emotion recognition infers the emotion of a piece of music from listeners' annotations. Because traditional music emotion recognition models achieve low recognition rates, this paper proposes a music emotion recognition model based on deep learning. The model uses deep learning to analyze the dynamic emotion of music and adopts the valence-arousal (VA) model to generate a sequence of VA values for a song. Specifically, the background of music emotion recognition and the emotion classification model are first elaborated. Then, the representation of emotion is explained. Next, the overall framework for dynamic music emotion recognition is designed. Finally, a convolutional neural network is combined with a bidirectional long short-term memory network (BiLSTM) to perform dynamic VA prediction, and this method is compared with other related recognition methods.
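The architecture the abstract describes, a convolutional front end feeding a BiLSTM that emits one valence-arousal pair per time step, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual model: the layer sizes, kernel width, and feature dimension are all assumptions chosen for clarity.

```python
import torch
import torch.nn as nn

class CNNBiLSTMVA(nn.Module):
    """Illustrative sketch of a CNN + BiLSTM dynamic VA regressor.

    A 1-D convolution runs over spectrogram-like frame features, a
    bidirectional LSTM models the resulting sequence, and a linear head
    emits one (valence, arousal) pair per frame. All hyperparameters
    here are assumptions, not taken from the paper.
    """
    def __init__(self, n_features=64, cnn_channels=32, lstm_hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, cnn_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.bilstm = nn.LSTM(cnn_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * lstm_hidden, 2)  # valence, arousal

    def forward(self, x):
        # x: (batch, time, n_features) -> Conv1d expects channels first
        z = self.cnn(x.transpose(1, 2)).transpose(1, 2)
        z, _ = self.bilstm(z)          # (batch, time, 2 * lstm_hidden)
        return self.head(z)            # (batch, time, 2) VA per frame

model = CNNBiLSTMVA()
clip = torch.randn(4, 100, 64)  # 4 clips, 100 frames, 64 spectral features
va = model(clip)
print(va.shape)  # (4, 100, 2): one VA pair per frame per clip
```

Emitting a prediction per frame (rather than one label per song) is what makes the recognition "dynamic": the VA trajectory can track emotional changes over the course of the piece.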
