Abstract

The seven basic facial expressions are the most important indicators of a person's psychological state, regardless of gender, age, culture, or nationality. These expressions are involuntary reactions that appear on the face for a short time and reveal whether the person is sad, happy, angry, scared, disgusted, surprised, or neutral. The human visual system and brain detect a person's emotion from facial expressions automatically, and computer vision researchers have long tried to replicate this automatic recognition in machines, with deep learning techniques coming closest to mimicking human intelligence. Even with deep learning, however, building a system that accurately distinguishes between facial expressions remains difficult because faces vary widely and some expressions of different emotions look very similar. This systematic review surveys deep learning-based methods for detecting emotions from facial expressions. The PRISMA methodology was used to search for and select studies on real-time emotion recognition published from 2019 to the present. The review collects the datasets used in this period to train, test, and validate the models proposed in the relevant studies, and describes each dataset in terms of its size, data type, and other characteristics. It also compares the relevant studies, identifies the best-performing technique, and outlines the challenges facing systems that detect emotions through facial expressions.
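To make the underlying task concrete, the sketch below shows a minimal convolutional classifier over the seven basic expressions. It is an illustrative assumption only (PyTorch, 48x48 grayscale face crops as in FER2013-style datasets), not a model taken from any of the reviewed studies.

```python
import torch
import torch.nn as nn

# Minimal sketch: a small CNN that maps a 48x48 grayscale face crop to
# logits over the seven basic expressions (anger, disgust, fear,
# happiness, sadness, surprise, neutral). Architecture and input size
# are assumptions for illustration, not from the reviewed papers.
class ExpressionCNN(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),  # logits over 7 expression classes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = ExpressionCNN()
    dummy = torch.randn(1, 1, 48, 48)  # one grayscale 48x48 face crop
    print(model(dummy).shape)          # torch.Size([1, 7])
```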
