Abstract

Facial expressions are an effective means of conveying emotions and thoughts, and the capacity to recognize emotional states from facial expressions is essential for successful human-computer collaboration, data-driven animation, and human-robot communication. Recently published studies show that deep learning has become increasingly popular for image classification, and as a result substantial effort has gone into solving facial expression recognition (FER) with convolutional neural networks (CNNs). This study presents a novel FER technique that tunes activation functions, optimizers, and regularization hyperparameters, using facial expression images drawn from databases such as CK+ and JAFFE. The model recognizes seven emotions: happiness, sadness, surprise, fear, anger, disgust, and neutrality. Its performance was evaluated across choices of activation function, optimizer, and regularization, as well as other hyperparameters, as detailed in this study. In experiments, the FER technique recognized emotions most effectively when configured with the Adam optimizer, a Softmax output layer, and a dropout ratio of 0.1 to 0.2. It outperforms existing FER techniques that rely on handcrafted features and a single channel, and delivers superior network performance compared to current state-of-the-art techniques.
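To make the configuration described above concrete, the following is a minimal sketch of a CNN classifier of the kind the abstract outlines. The Adam optimizer, Softmax output layer, dropout ratio in the 0.1 to 0.2 range, and the seven emotion classes come from the abstract; the Keras framework, the 48x48 grayscale input shape (a common crop size for CK+ and JAFFE faces), and the specific convolutional stack are illustrative assumptions, not the paper's reported architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Seven classes per the abstract: happiness, sadness, surprise,
# fear, anger, disgust, and neutrality.
NUM_CLASSES = 7


def build_fer_cnn(input_shape=(48, 48, 1), dropout_rate=0.2):
    """Build a small FER CNN.

    Adam, Softmax, and the 0.1-0.2 dropout ratio follow the abstract;
    the layer sizes and input shape are assumptions for illustration.
    """
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Dropout(dropout_rate),  # regularization, 0.1-0.2 range
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(dropout_rate),
        # Softmax output over the seven emotion classes.
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",  # Adam optimizer, per the abstract
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


model = build_fer_cnn()
model.summary()
```

Under this setup, varying `dropout_rate` between 0.1 and 0.2 and swapping the optimizer or activation functions would reproduce the kind of hyperparameter comparison the study describes.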
