Abstract

Music has become an integral part of everyone's life. A variety of internet-connected music applications now suggest related songs based on a user's playlist, and let users exchange playlists and categorize songs into different genres. This paper proposes a system that identifies the user's emotion through facial analysis and plays the songs best suited to that emotion, speeding up song selection by eliminating manual searching. The proposed system uses Microsoft's emotion recognition service for facial analysis; its Microsoft Face API has analyzed over one million faces and reports an average true-positive rate of up to 60%. Within the application, this API captures and evaluates the emotion present in an image. Computer vision components assess the user's emotion from facial expressions: the device's camera captures the user's image, the system infers the emotion, and the detected emotion is mapped to a predefined playlist.
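To make the described pipeline concrete, the following is a minimal sketch of how an application might call the Azure Face "detect" REST endpoint with the emotion attribute (an attribute Microsoft has since restricted) and map the dominant emotion to a playlist. The endpoint, key, playlist names, and the emotion-to-playlist mapping are placeholders for illustration; the paper does not specify them.

```python
# Sketch only: assumes the Azure Face detect REST endpoint with the `emotion`
# face attribute. FACE_ENDPOINT, FACE_KEY, and PLAYLISTS are hypothetical values.
import requests

FACE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
FACE_KEY = "<subscription-key>"

# Hypothetical emotion-to-playlist mapping; the paper only states that
# detected emotions are mapped to predefined playlists.
PLAYLISTS = {
    "happiness": "Upbeat Hits",
    "sadness": "Calm Acoustic",
    "anger": "Hard Rock",
    "neutral": "Everyday Mix",
    "surprise": "Fresh Finds",
    "fear": "Soothing Piano",
    "disgust": "Chill Vibes",
    "contempt": "Classic Favourites",
}


def detect_emotion(image_path: str) -> str:
    """Send a captured face image to the Face API and return the dominant emotion."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": FACE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    faces = response.json()
    if not faces:
        raise ValueError("No face detected in the captured image.")
    emotions = faces[0]["faceAttributes"]["emotion"]  # per-emotion scores in [0, 1]
    return max(emotions, key=emotions.get)


def recommend_playlist(image_path: str) -> str:
    """Map the dominant detected emotion to a predefined playlist."""
    emotion = detect_emotion(image_path)
    return PLAYLISTS.get(emotion, "Everyday Mix")


if __name__ == "__main__":
    # The image would normally come from the device camera capture step.
    print(recommend_playlist("captured_frame.jpg"))
```

In practice the captured camera frame replaces the file path, and the returned playlist name keys into the application's music library.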
