Abstract

Music is an exemplary indicator of a person's emotional state. It is the language of the soul: what cannot be articulated through words is easily conveyed through a melody. Music not only speaks to a person's emotional and mental state, but it is also known to have a therapeutic effect on the listener. Traditional music recommendation methods use collaborative or content-based filtering to recommend songs, yet a person's song choices do not depend only on the songs they usually listen to; they depend largely on their emotional state. Despite fast-paced innovation in the music application industry, there is still scope for improving the user experience by creating an encompassing application that not only allows users to enjoy their favorite songs but also tailors recommendations to their emotional state. Thus, an emotion detection system using Convolutional Neural Networks (CNNs) is proposed. The user feeds in a custom playlist containing a mixture of musical genres, which is classified into different emotions using K-Means clustering. The CNN model detects the emotional state of the user and recommends a series of songs from the classified playlist. This interactive interface is designed for users who need song recommendations that suit their current state of mind.
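To make the described pipeline concrete, the following is a minimal sketch of the recommendation flow, assuming songs are represented by pre-extracted audio feature vectors and that the CNN's output is one of a small set of emotion labels. The feature representation, the label set, the cluster-to-emotion mapping, and the `detect_emotion_cnn` hook are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: cluster a user's playlist into emotion groups with K-Means,
# then recommend songs from the cluster matching the CNN-detected emotion.
import numpy as np
from sklearn.cluster import KMeans

EMOTIONS = ["happy", "sad", "angry", "calm"]  # assumed emotion label set


def cluster_playlist(features: np.ndarray, n_emotions: int = 4) -> np.ndarray:
    """Group playlist songs into emotion clusters via K-Means on audio features."""
    km = KMeans(n_clusters=n_emotions, n_init=10, random_state=0)
    return km.fit_predict(features)  # one cluster index per song


def recommend(song_ids, cluster_labels, detected_emotion, cluster_to_emotion):
    """Return the songs whose cluster maps to the emotion detected by the CNN."""
    return [
        sid
        for sid, c in zip(song_ids, cluster_labels)
        if cluster_to_emotion[c] == detected_emotion
    ]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    features = rng.random((20, 5))            # 20 songs x 5 audio features (dummy data)
    song_ids = [f"song_{i}" for i in range(20)]
    labels = cluster_playlist(features)
    # In practice the cluster-to-emotion mapping would come from labeled seed
    # songs; here it is a placeholder assumption.
    mapping = {i: EMOTIONS[i] for i in range(len(EMOTIONS))}
    detected = "happy"                        # stand-in for the CNN's prediction
    print(recommend(song_ids, labels, detected, mapping))
```

The key design choice the abstract implies is decoupling the two models: K-Means organizes the user's own playlist offline, while the CNN only has to predict a single emotion label at recommendation time.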
