Abstract

Music is an important carrier of emotion and an indispensable part of daily life. With the rapid growth of digital music, the demand for music emotion analysis and retrieval is also increasing, and automatic recognition of music emotion has become a major research focus. For music, emotion is its most essential feature and its deepest inner meaning. In a ubiquitous information environment, revealing the deep semantic information of multimodal information resources and providing users with integrated information services has significant research and application value. This paper proposes a multimodal fusion algorithm for music emotion analysis and constructs a dynamic model based on reinforcement learning to improve analysis accuracy. The model dynamically adjusts its emotion-analysis results by learning from user behavior, thereby tailoring results to each user's emotional preferences.
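The abstract describes a reinforcement-learning model that adjusts emotion-analysis results by learning from user behavior. The paper does not specify the algorithm; as a minimal illustrative sketch, one could treat candidate emotion labels as arms of an epsilon-greedy bandit and reward labels the user accepts. All names and values below are assumptions, not details from the paper.

```python
import random

# Illustrative emotion labels; the paper's actual label set may differ.
EMOTIONS = ["happy", "sad", "calm", "angry"]

class EmotionBandit:
    """Epsilon-greedy bandit over emotion labels, rewarded by user behavior."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.value = {e: 0.0 for e in EMOTIONS}  # estimated reward per label
        self.count = {e: 0 for e in EMOTIONS}    # feedback events per label

    def predict(self):
        # Mostly exploit the best-valued label; occasionally explore.
        if random.random() < self.epsilon:
            return random.choice(EMOTIONS)
        return max(EMOTIONS, key=lambda e: self.value[e])

    def feedback(self, emotion, reward):
        # Incremental mean update from user behavior (e.g. accept=1, skip=0).
        self.count[emotion] += 1
        self.value[emotion] += (reward - self.value[emotion]) / self.count[emotion]
```

Over time the bandit's predictions drift toward the labels a given user responds to, which is one simple way to realize the "personalized customization" the abstract mentions.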

Highlights

  • With the rapid growth of digital music, traditional music analysis and retrieval methods increasingly struggle to meet users' needs

  • Music feature analysis is a crucial step in music emotion analysis. The multimodal method analyzes music emotion separately from audio content and from lyrics, then combines the two results into the final emotion prediction. The relationship between a song's structured information and human emotion cannot be fully captured by existing common features

  • Based on reinforcement learning, this paper studies music emotion analysis from the perspective of audio visualization
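The second highlight describes analyzing audio and lyrics separately and combining the two results. A common realization of this is weighted late fusion of per-emotion scores; the sketch below assumes that scheme, with the weight `alpha` and all scores as illustrative values, since the paper's exact fusion rule is not given here.

```python
def fuse_emotion_scores(audio_scores, lyric_scores, alpha=0.6):
    """Weighted late fusion of per-emotion score dictionaries.

    alpha weights the audio (content) modality; (1 - alpha) weights lyrics.
    Returns the top fused label and the full fused score dictionary.
    """
    emotions = set(audio_scores) | set(lyric_scores)
    fused = {
        e: alpha * audio_scores.get(e, 0.0) + (1 - alpha) * lyric_scores.get(e, 0.0)
        for e in emotions
    }
    # The predicted emotion is the label with the highest fused score.
    return max(fused, key=fused.get), fused

label, scores = fuse_emotion_scores(
    {"happy": 0.7, "sad": 0.2},  # content-based (audio) analysis
    {"happy": 0.4, "sad": 0.5},  # lyrics-based analysis
)
```

Here the audio model favors "happy" while the lyrics model slightly favors "sad"; the fused result follows the more heavily weighted audio modality.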


Summary

Introduction

With the rapid growth of digital music, traditional music analysis and retrieval methods increasingly struggle to meet users' needs. Music is a symbol performers use to express their thoughts and convey their emotions. Therefore, emotion-based music retrieval is one of the key research topics in music information retrieval systems [7]. As an important means of automatic music retrieval, classifying music according to the emotion it expresses is attracting the attention of researchers from different fields. We discuss music feature analysis based on lyrics and a multimodal-fusion method for music emotion feature analysis that improves analysis accuracy. Therefore, feature-extraction methods with greater capacity for music emotion analysis can be explored further.

Related Work
Music Emotion Feature Analysis Method Based on Multimodal Fusion
Conclusions
