Abstract

Emotions are a fundamental part of human behavior and can be stimulated in numerous ways. In daily life, we come across different types of objects, such as cakes, crabs, televisions, and trees, which may excite certain emotions. Likewise, the object images that we see and share on different platforms are capable of expressing or inducing human emotions. Inferring emotion tags from these object images is significant because it can play a vital role in recommendation systems, image retrieval, human behavior analysis, and advertisement applications. Existing schemes for emotion tag perception are based on visual features, such as the color and texture of an image, which are adversely affected by lighting conditions. The main objective of the proposed study is to address this problem by introducing a novel idea: inferring emotion tags from images based on object-related features. To this end, we first created an emotion-tagged dataset from a publicly available object detection dataset (i.e., "Caltech-256") using subjective evaluation by 212 users. Next, we used a convolutional neural network (CNN)-based model to automatically extract high-level features from object images for recognizing nine emotion categories: amusement, awe, anger, boredom, contentment, disgust, excitement, fear, and sadness. Experimental results on our emotion-tagged dataset endorse the success of the proposed idea in terms of accuracy, precision, recall, specificity, and F1-score. Overall, the proposed scheme achieved accuracy rates of approximately 85% and 79% for top-level and bottom-level emotion tagging, respectively. We also performed a gender-based analysis of inferred emotion tags and observed that male and female subjects differ in emotion perception across different object categories.

Highlights

  • Emotions represent the mental and psychological state of a human being and are a crucial element of human daily living behavior [1,2]

  • The authors in [21] presented eight emotion categories to study human emotion perception, in which happiness is split into four positive emotions, namely amusement, excitement, awe, and contentment, while negative sentiment is further categorized into anger, fear, disgust, and sadness [14,22]

  • This section explains the series of analyses, along with comprehensive performance and empirical results, used to evaluate our proposed scheme for inferring emotion tags based on object categories

Summary

Introduction

Emotions represent the mental and psychological state of a human being and are a crucial element of daily human behavior [1,2]. To address the challenges discussed above, in this study we explore the implications of inferring emotion tags from images based on object analysis and categorization. Our proposed scheme rests on the hypothesis that emotion tags can be accurately inferred from images using object-related features, alleviating the existing challenges associated with visual features (such as image contrast, brightness, and color information). To the best of our knowledge, no prior work has inferred emotion tags from images based on object analysis. We propose a novel idea for inferring emotion tags from images based on object classification. To this end, we used a public-domain object detection dataset, i.e., Caltech-256, to implement and validate our proposed idea using different supervised machine learning classifiers.
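The pipeline sketched in the introduction, extracting high-level features from object images with an off-the-shelf CNN and then classifying them into the nine emotion categories with a supervised classifier, can be illustrated as follows. This is a minimal sketch, not the authors' implementation: it assumes the CNN feature vectors have already been extracted (synthetic vectors stand in for real ones here), and it substitutes a simple nearest-centroid classifier for the supervised classifiers evaluated in the paper.

```python
import random

# The nine emotion categories named in the abstract.
EMOTIONS = ["amusement", "awe", "anger", "boredom", "contentment",
            "disgust", "excitement", "fear", "sadness"]

def centroid(vectors):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_centroid_predict(feature, centroids):
    """Assign the emotion tag whose centroid is closest in Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda tag: dist2(feature, centroids[tag]))

# Hypothetical training data: 8-dimensional "CNN features" per emotion tag.
# In the actual scheme these would come from an off-the-shelf CNN.
random.seed(0)
train = {tag: [[random.gauss(i, 0.5) for _ in range(8)] for _ in range(5)]
         for i, tag in enumerate(EMOTIONS)}
centroids = {tag: centroid(vecs) for tag, vecs in train.items()}

# Classify a new synthetic feature vector (drawn near one training cluster).
query = [random.gauss(3.0, 0.5) for _ in range(8)]
predicted_tag = nearest_centroid_predict(query, centroids)
print(predicted_tag)
```

Any supervised classifier (e.g., an SVM or k-NN, as commonly paired with off-the-shelf CNN features) could replace the nearest-centroid step; the point is only the two-stage structure of feature extraction followed by classification.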

Proposed Methodology
Data Acquisition
Data Annotation and Tagging
Emotion Tag Assignment to Bottom-Level Object Categories
Emotion Tag Assignment to Top-Level Object Categories
Gender-Based Emotion Tag Assignment
Feature Extraction Using Off-the-Shelf CNN
Feature Classification
Results and Discussion
Method of Analysis and Classifiers
Performance Analysis for Inferring Emotion Tags at Top-Level
Performance Analysis for Inferring Emotion Tags at Bottom-Level
Comparison with the State-of-the-Art Study
Results
Conclusions
