Abstract

Human emotion detection is essential to social interaction and plays an important role in daily life, and artificial intelligence research into automated emotion detection is on the rise. The capability to identify emotions, considered one of the traits of emotional intelligence, is a component of human intelligence. Although research based on facial expressions or voice is flourishing, identifying emotions from body movements remains a less studied problem. To attain emotional intelligence, this study proposes a deep learning approach. First, the video is converted into image frames, and the frames are preprocessed using the Glitter bandpass Butterworth filter and contrast-stretch histogram equalization. Features from the enhanced images are then clustered using the hybrid Gaussian BIRCH algorithm. Next, specialized features are extracted from human body gestures using the AdaDelta bacteria foraging optimization algorithm, and the selected features are fed to a supervised Kernel Boosting LENET deep-learning classifier. The experiment is conducted on the Geneva Multimodal Emotion Portrayals (GEMEP) corpus, which includes human body gestures portraying the archetypes of five emotions: anger, fear, joy, pride, and sadness. Among these emotion detection techniques, the proposed Kernel Boosting LENET classifier achieves 98.5% accuracy, 94% precision, 95% sensitivity, and a 93% F-score, outperforming the other existing classifiers. As a result, emotion recognition may help small and medium enterprises (SMEs) improve their performance and entrepreneurial orientation. The correlation coefficient of 188 and the significance coefficient of 0.00 show that emotional intelligence and SME performance have a significant and positive association.
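The preprocessing stage described above (frame enhancement via a bandpass Butterworth filter and contrast-stretch histogram equalization) can be illustrated with a minimal sketch. This is not the paper's exact "Glitter" variant; it assumes standard SciPy Butterworth filtering applied along image rows, with illustrative cutoff frequencies, and a textbook histogram-equalization lookup table.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_filter(img, low=0.05, high=0.45, order=4):
    # Generic Butterworth bandpass applied along image rows.
    # Cutoffs are illustrative fractions of the Nyquist frequency,
    # not values taken from the paper.
    b, a = butter(order, [low, high], btype="band")
    return filtfilt(b, a, img.astype(float), axis=1)

def equalize(img, levels=256):
    # Contrast-stretch histogram equalization for an 8-bit grayscale frame:
    # build the cumulative histogram and remap intensities so the
    # output spans the full 0..255 range.
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
    return lut[img].astype(np.uint8)

# Usage: enhance one grayscale frame extracted from the input video.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in frame
enhanced = equalize(frame)
filtered = bandpass_filter(enhanced)
```

In a full pipeline, each video frame would be enhanced this way before clustering and feature selection.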

