Abstract

In the twenty-first century, an era characterized by the proliferation of digital technology and big data, the ability to identify human emotions from visual content in images has gained considerable importance and growing popularity worldwide. This project addresses the task of detecting emotions from images using deep learning techniques, with a specific emphasis on MobileNet-based architectures. We begin by preparing a dataset of images depicting diverse emotions. The MobileNet architecture, a powerful convolutional neural network, is fine-tuned with a custom dense layer to classify emotions into seven distinct categories. Data augmentation techniques such as zooming, shearing, and horizontal flipping are incorporated to enhance robustness and prevent overfitting. The training dataset is preprocessed and normalized, while a separate validation dataset ensures rigorous evaluation. During training, we implement early stopping and model checkpointing to obtain optimal performance while avoiding overfitting. After training, analysis of the accuracy and loss curves provides insight into the model's learning trajectory. To demonstrate practical applicability, the trained model is used to predict the emotion in single images, showcasing its potential in domains such as digital marketing, healthcare, and user experience design. In today's digital landscape, these findings are relevant to a wide spectrum of applications and promise advances in human-computer interaction and emotion-aware systems.
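To make the described pipeline concrete, the sketch below illustrates one way the setup in the abstract could be assembled: a MobileNet backbone with a custom dense head for seven emotion classes, augmentation via zooming, shearing, and horizontal flipping, normalization of inputs, and early stopping plus model checkpointing during training. This is a minimal illustration only, assuming TensorFlow/Keras; the image size, optimizer, directory names, batch size, and augmentation magnitudes are assumptions, not the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras.applications import MobileNet
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

NUM_CLASSES = 7          # seven emotion categories, as stated in the abstract
IMG_SIZE = (224, 224)    # assumed MobileNet input resolution

# Augmentation: zoom, shear, horizontal flip (per the abstract), plus rescaling
train_gen = ImageDataGenerator(rescale=1.0 / 255, zoom_range=0.2,
                               shear_range=0.2, horizontal_flip=True)
val_gen = ImageDataGenerator(rescale=1.0 / 255)   # validation data: normalization only

# Hypothetical directory layout: one subfolder per emotion class
train_data = train_gen.flow_from_directory("data/train", target_size=IMG_SIZE,
                                           batch_size=32, class_mode="categorical")
val_data = val_gen.flow_from_directory("data/val", target_size=IMG_SIZE,
                                       batch_size=32, class_mode="categorical")

# MobileNet backbone fine-tuned with a custom dense classification head
base = MobileNet(weights="imagenet", include_top=False,
                 input_shape=IMG_SIZE + (3,))
x = GlobalAveragePooling2D()(base.output)
outputs = Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs=base.input, outputs=outputs)

model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

# Early stopping and model checkpointing to curb overfitting, as described
callbacks = [
    EarlyStopping(monitor="val_loss", patience=5, restore_best_weights=True),
    ModelCheckpoint("best_model.h5", monitor="val_accuracy", save_best_only=True),
]

history = model.fit(train_data, validation_data=val_data,
                    epochs=30, callbacks=callbacks)
```

The `history` object returned by `fit` holds the per-epoch accuracy and loss values whose trajectories the abstract refers to, and the checkpointed model can then be reloaded to predict the emotion in a single image.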
