Abstract

Facial expressions and body language can tell us a great deal about what a person is thinking. They are a form of non-verbal communication that conveys how the person is feeling and describes their mood, such as whether they are happy or sad. Such detection can be performed using techniques already established in the research literature, such as instrumented sensor technology and computer vision, and these approaches can be further classified by whether the person is stationary or in motion. This paper focuses on detecting a person's emotions using computer vision. Using artificial intelligence techniques and MediaPipe together with computer vision, we track various joints of the body, store their coordinates in a Python file, and then test our algorithm to detect the person's mood. In addition, a dialogue box pops up during detection, reporting the probability (i.e., the confidence of the detection) and identifying which emotion it is. The current model detects three gestures corresponding to three emotions: happy, sad, and victorious. The algorithm classifies emotions based on the differences between the joint coordinates. Detection at a distance may be an issue, since the coordinates will differ in that case. This paper is a thorough general overview of body gesture detection with a brief description of the proposed system.
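To illustrate the coordinate-difference idea described above, the sketch below classifies a pose from joint coordinates such as those MediaPipe provides (normalized image coordinates, where y increases downward). The landmark names, rules, and thresholds here are illustrative assumptions, not the authors' exact algorithm:

```python
def classify_gesture(landmarks):
    """Classify 'victorious', 'happy', or 'sad' from a dict mapping
    landmark names to (x, y) coordinates. The decision rules below are
    hypothetical, shown only to illustrate coordinate-difference logic."""
    lw, rw = landmarks["left_wrist"], landmarks["right_wrist"]
    ls, rs = landmarks["left_shoulder"], landmarks["right_shoulder"]
    nose = landmarks["nose"]

    # In image coordinates, a smaller y means higher in the frame.
    # Both wrists raised above the head: a 'victory' gesture.
    if lw[1] < nose[1] and rw[1] < nose[1]:
        return "victorious"
    # Both wrists above the shoulders: 'happy'.
    if lw[1] < ls[1] and rw[1] < rs[1]:
        return "happy"
    # Otherwise treat lowered arms as 'sad'.
    return "sad"


# Example pose: both wrists above the nose.
pose = {
    "nose": (0.50, 0.20),
    "left_shoulder": (0.40, 0.40),
    "right_shoulder": (0.60, 0.40),
    "left_wrist": (0.35, 0.10),
    "right_wrist": (0.65, 0.12),
}
print(classify_gesture(pose))  # -> victorious
```

In a full pipeline, the `landmarks` dict would be filled from a pose-estimation library such as MediaPipe Pose, and the rule-based classifier could be replaced by a model trained on the stored coordinates.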
