Abstract
Music emotion plays an important role in listeners' cognition. With rapid technological development, music has become more diverse and spreads faster than ever, yet the cost of music production remains high. To address this problem, AI music composition has gained increasing attention in recent years. The purpose of this study is to build an automated composition system that combines music, emotion, and machine learning. The system takes a music database with emotion tags as input and trains a CVAE-GAN model via deep learning to generate music segments corresponding to specified emotions. Subjects then listen to the system's output and judge whether the generated music matches the intended emotion.
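The key mechanism in a conditional VAE-GAN is that the emotion label is fed to the generator alongside the latent vector, so sampling with different labels yields differently conditioned outputs. A minimal sketch of that conditioning step follows; the emotion tag set and latent dimension are illustrative assumptions, as the abstract does not specify them.

```python
import random

# Illustrative sketch of CVAE-GAN-style conditioning: the emotion tag is
# one-hot encoded and concatenated with the latent vector z, so the
# generator/decoder produces output conditioned on the chosen emotion.
# EMOTIONS and LATENT_DIM are hypothetical, not taken from the paper.

EMOTIONS = ["happy", "sad", "tense", "calm"]  # assumed emotion tag set
LATENT_DIM = 8                                # assumed latent size

def one_hot(label):
    """Encode an emotion tag as a one-hot vector over EMOTIONS."""
    return [1.0 if e == label else 0.0 for e in EMOTIONS]

def conditioned_latent(label, rng=random):
    """Sample z ~ N(0, I) and append the emotion condition to it."""
    z = [rng.gauss(0.0, 1.0) for _ in range(LATENT_DIM)]
    return z + one_hot(label)

# The generator would consume this (LATENT_DIM + len(EMOTIONS))-dim vector;
# changing only the label steers generation toward a different emotion.
vec = conditioned_latent("happy")
```

In a full model, both the encoder (during training) and the generator receive this conditioning vector, which is what lets the system target a specified emotion at sampling time.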