Abstract

A user's facial expressions can reveal their emotional state, and these expressions can be captured from a live camera feed. In computer vision (CV) and machine learning (ML), a great deal of research is devoted to training machines to recognize different human emotions or moods, and machine learning offers a variety of methods for detecting them. A review of existing music systems revealed that many music applications rely on the user's past listening choices rather than recommending songs based on their current emotion. The goal of this project is to identify emotions in human faces from real-time data and to suggest songs according to those emotions. Music is a great unifier: it binds us despite differences in age, background, language, interests, and income level. Because music is accessible and can accompany daily activities, travel, sports, and more, music players and other streaming apps are in high demand. Owing to the rapid growth of mobile networks and digital multimedia technology, digital music has emerged as a primary form of consumer content sought by many young people. People frequently use music as a tool for mood regulation, specifically to improve mood, boost energy, or soothe tension, and listening to the right music at the right moment can support mental wellness. Music and human emotion are therefore closely related. As a result, the proposed system is an interactive platform for suggesting music depending on the user's present emotional state. This could also be a valuable feature to incorporate into existing music player applications.
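The pipeline the abstract describes (classify the emotion in a camera frame, then map it to songs) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `detect_emotion` is a hypothetical stand-in for a real-time facial-expression classifier (e.g. a CNN applied to camera frames), and the playlist names are invented placeholders.

```python
# Hypothetical emotion-to-music mapping; a real system would replace
# detect_emotion() with a trained facial-expression model running on
# live camera frames.

EMOTION_PLAYLISTS = {
    "happy":   ["Upbeat Pop Mix", "Feel-Good Classics"],
    "sad":     ["Calm Acoustic", "Comfort Songs"],
    "angry":   ["Soothing Instrumentals", "Chill Lo-fi"],
    "neutral": ["Daily Mix", "Fresh Finds"],
}

def detect_emotion(frame) -> str:
    """Stand-in classifier: a real system would run inference on the frame."""
    return "happy"  # placeholder prediction

def recommend(frame) -> list[str]:
    """Map the detected emotion to a list of suggested playlists."""
    emotion = detect_emotion(frame)
    # Fall back to neutral suggestions for unrecognized emotion labels.
    return EMOTION_PLAYLISTS.get(emotion, EMOTION_PLAYLISTS["neutral"])
```

The key design point is the decoupling: the classifier produces a discrete emotion label, and the recommender only consumes that label, so either side can be swapped out independently.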

