Abstract

The human face plays a significant role in conveying emotional states; most nonverbal communication between humans occurs through changes in facial expression. People listen to music to lift their spirits, calm their nerves, and re-energize themselves, which suggests that hearing the right song at the right time can positively affect one's mood. Thanks to the proliferation of mobile networks and digital multimedia, music is now an integral part of many young people's daily lives, and music has been shown to significantly influence listeners' emotional states. People of all ages, nationalities, languages, and economic, social, and demographic backgrounds can find common ground through a shared appreciation of music, and music players and streaming apps are in high demand because users can listen to their music whenever and wherever they like. This study proposes a mood-based music playback system that identifies the user's emotional state in real time and recommends songs accordingly. A webcam records the user's facial expressions, and from this information the system builds a playlist of songs congruent with the detected mood. Such facial-expression-based music players scan the user's face against a set of criteria and then play songs that match the observed expression.
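The pipeline the abstract describes (capture facial expression, classify the mood, map the mood to a playlist) could be sketched as follows. This is a minimal illustration, not the paper's implementation: the emotion classifier is a hypothetical stub standing in for a trained facial-expression model, and the playlist names are invented for the example.

```python
# Sketch of a mood-to-playlist pipeline, assuming a separate
# facial-expression classifier supplies the detected emotion.

PLAYLISTS = {
    "happy": ["Upbeat Pop Mix", "Dance Hits"],
    "sad": ["Mellow Acoustic", "Comfort Classics"],
    "angry": ["Calming Instrumentals", "Ambient Chill"],
    "neutral": ["Everyday Favourites", "Top Charts"],
}

def classify_emotion(frame):
    """Hypothetical stand-in for a trained facial-expression model
    (e.g. a CNN applied to a webcam frame). A real system would
    return one of the PLAYLISTS keys based on the detected face."""
    raise NotImplementedError("plug in a real emotion classifier")

def recommend(emotion: str) -> list:
    """Map a detected emotion to a playlist, falling back to a
    neutral playlist for unrecognized labels."""
    return PLAYLISTS.get(emotion, PLAYLISTS["neutral"])
```

In a full system, frames would be read from the webcam in a loop, `classify_emotion` would be called periodically, and the player would switch playlists only when the detected mood changes persistently, to avoid flickering between moods frame to frame.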
