Abstract

The human face plays a significant role in conveying a person's mood, and emotion is important in various fields, including biomedical engineering, brain science, neuroscience, psychological wellness, and mental health. Emotions are not only used to determine the state of the human mind but can also drive recommendation systems that help people find content matching their needs and preferences. This motivated us to create a framework that can effectively and efficiently recognize emotions from a user's facial expressions and recommend YouTube videos based on the detected emotion. Emotions are brought about by neurophysiological changes associated with thoughts, feelings, and behavioral changes in a person. Every emotion needs to be addressed appropriately; research suggests that watching YouTube videos influences mood, and in our experiment we try to influence it positively. Changes in emotion can be detected by observing facial expressions, body language, and tone of voice; among these indicators of a person's mood, facial expressions are one of the most important. In our experiment, we used the Haar Cascade Classifier to detect the face and a convolutional neural network (CNN) to classify the facial expression. Our proposed model achieves 95.66% accuracy for face detection and 61.88% for emotion detection.

Key Words: Emotion Detection, FER-2013, Haar Cascade, YouTube Recommendation, CNN.
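Since the abstract only names the pipeline's components, the following is a minimal sketch of how such a system could be wired together. It assumes OpenCV's bundled frontal-face Haar cascade, a Keras CNN trained on FER-2013 saved as "emotion_cnn.h5" (a hypothetical filename; the paper does not publish its model), and a simple emotion-to-query mapping of our own, since the abstract does not specify the actual recommendation logic:

```python
# Sketch of the described pipeline: Haar cascade face detection,
# CNN emotion classification, and a YouTube search recommendation.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# FER-2013 uses seven emotion classes, conventionally in this order.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# Hypothetical emotion-to-query mapping (an assumption, not the paper's logic).
QUERIES = {
    "angry": "calming music",
    "disgust": "satisfying videos",
    "fear": "relaxing nature videos",
    "happy": "funny videos",
    "sad": "motivational talks",
    "surprise": "amazing facts",
    "neutral": "popular videos",
}

# OpenCV ships the pretrained frontal-face Haar cascade with the library.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
# Assumed model: 48x48 grayscale input, softmax over the 7 FER-2013 classes.
model = load_model("emotion_cnn.h5")

def recommend_from_frame(frame):
    """Detect a face, classify its emotion, and build a YouTube search URL."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                                # first detected face
    roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))   # FER-2013 input size
    roi = roi.astype("float32") / 255.0                  # normalize pixels
    probs = model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
    emotion = EMOTIONS[int(np.argmax(probs))]
    query = QUERIES[emotion].replace(" ", "+")
    return emotion, f"https://www.youtube.com/results?search_query={query}"
```

In a real deployment, frames would come from `cv2.VideoCapture` and the returned URL would be opened in a browser; the reported 95.66% and 61.88% figures refer to the paper's own trained model, not this sketch.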
