Abstract

Objectives: Emotion detection is an important field in human-computer interaction, and this study plays a significant role in identifying facial expressions from images. Identifying even a single emotion must cope with wide variability in human appearance, such as pose, color, texture, expression, posture, and orientation. In this study, we implement Local Binary Pattern (LBP) based filters to identify dynamic face textures; the approach is also simple and easy to extend. Methods/Statistical Analysis: We used the built-in FER2013 dataset, which consists of seven classes (Surprise, Fear, Angry, Neutral, Sad, Disgust, Happy). The dataset is divided into three parts: testing, validation, and training (15% and 70%). A Convolutional Neural Network is trained with the Local Binary Pattern feature descriptor. Findings: The experimental results demonstrate that local LBP representations are effective for spatial dynamic feature extraction, as they encode the image's texture configuration while providing local structure patterns. The advantages of our approach include local processing, robustness to monotonic grayscale changes, and simple computation. The results show that the LBP-based Convolutional Neural Network (CNN) model outperforms a conventional CNN. This study is further applicable to the image classification and image processing fields. Application/Improvements: We recommend LBP for finding local regions and patterns in an image; its computation is simple and local, and it is robust to monotonic grayscale changes.

Keywords: Convolutional Neural Network (CNN), Facial Emotion, Facial Expression, Face Detection, Expressions, Local Binary Pattern (LBP)
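The three-way split described above can be sketched as follows. This is a hedged illustration, not the study's code: the abstract states only "15% and 70%" for the three parts, so we assume the remaining 15% is the testing share, and the array names (`images`, `labels`) are placeholders for the FER2013 data.

```python
import numpy as np

# Placeholder stand-ins for the FER2013 arrays (48x48 grayscale faces,
# seven emotion classes); in practice these come from the dataset loader.
rng = np.random.default_rng(0)
n = 1000
images = rng.random((n, 48, 48))
labels = rng.integers(0, 7, size=n)

# Shuffle indices once, then carve out 70% training / 15% validation /
# 15% testing (the testing share is our assumption, see lead-in).
idx = rng.permutation(n)
n_train = int(0.70 * n)
n_val = int(0.15 * n)
train_idx = idx[:n_train]
val_idx = idx[n_train:n_train + n_val]
test_idx = idx[n_train + n_val:]
```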

Highlights

  • Facial expression is one of the most important channels of nonverbal communication, demonstrating internal affective states and intentions

  • In the final stage of automatic emotion recognition, the various emotion-based expressions are classified from the extracted features; classification techniques such as Support Vector Machines (SVM)8, Simple Neural Networks (SNN)9, Hidden Markov Models (HMM)10, K-Nearest Neighbor (KNN)11, and rule-based classifiers12 have been implemented for emotion detection and expression recognition

  • By merging the Local Binary Pattern (LBP) feature-extraction operator with a Convolutional Neural Network (CNN), we achieved 74% testing accuracy, which is better than HOGCNN16


Introduction

Facial expression is one of the most important channels of nonverbal communication, demonstrating internal affective states and intentions. Face acquisition is the basic processing stage, in which the important regions are located in the input image. After locating and identifying the face regions, the second step is to extract features from the original input image to represent the facial expressions. This task can be done through two suggested families of techniques: a) appearance-based methods and b) geometric-based methods. Several image filters are used to extract the complete facial features, isolate the important regions of the input image, and capture changes in facial appearance with the help of Linear Discriminant Analysis (LDA), Principal Component Analysis (PCA), and Gabor Wavelet Analysis (GWA). In the final stage of automatic emotion recognition, the various emotion-based expressions are classified from the extracted features; classification techniques such as Support Vector Machines (SVM), Simple Neural Networks (SNN), Hidden Markov Models (HMM), K-Nearest Neighbor (KNN), and rule-based classifiers have been implemented for emotion detection and expression recognition.
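The LBP operator mentioned throughout can be sketched as follows. This is a minimal 3×3 version for illustration only (the function names are ours, and the study's exact filter configuration may differ): each of the eight neighbours is thresholded against the centre pixel and the results are read off as an 8-bit code, which is what makes the descriptor invariant to monotonic grayscale changes.

```python
import numpy as np

def lbp_code(patch):
    """Basic 3x3 LBP: threshold the 8 neighbours against the centre pixel
    and pack the results into an 8-bit code (clockwise from top-left)."""
    center = patch[1, 1]
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2],
                  patch[1, 2], patch[2, 2], patch[2, 1],
                  patch[2, 0], patch[1, 0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= center:
            code |= 1 << bit
    return code

def lbp_image(img):
    """Apply the 3x3 LBP operator to every interior pixel of a grayscale image,
    producing an (h-2) x (w-2) map of texture codes."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = lbp_code(img[i:i + 3, j:j + 3])
    return out
```

Because only the ordering of pixel values relative to the centre matters, adding a constant brightness offset to the whole image leaves every LBP code unchanged, which is the robustness property the abstract refers to.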

