Abstract

Recognizing facial emotions through pattern recognition remains difficult for some researchers. In general, such recognition uses all facial features; this study, however, is limited to identifying facial emotions from a single facial region. Specifically, it uses the lips, one of the facial features that can reveal a person's expression. Features are extracted from facial images using a combination of local binary pattern (LBP) and grey level co-occurrence matrix (GLCM) methods and classified with a multiclass support vector machine. The pipeline begins with image segmentation to produce an image of the mouth. Experiments under various test configurations achieved recognition performance of up to 95%; this result was obtained with test splits ranging from 10% to 40% of the data. These findings can be applied to expression recognition in online learning media to monitor the audience's condition directly.
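The following is a minimal sketch of the kind of pipeline the abstract describes (mouth-region segmentation, LBP + GLCM feature extraction, multiclass SVM classification), not the authors' implementation. It assumes the mouth crops are already available as grayscale images, and it uses scikit-image and scikit-learn; all parameter values (LBP radius, GLCM offsets, SVM kernel, split sizes) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of an LBP + GLCM + multiclass SVM pipeline.
# Not the authors' code; library choices and parameters are assumptions.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def extract_features(mouth_img):
    """Concatenate an LBP histogram and GLCM statistics for one uint8 grayscale mouth crop."""
    # LBP: uniform patterns, 8 neighbors at radius 1 (illustrative settings)
    lbp = local_binary_pattern(mouth_img, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

    # GLCM: four Haralick-style statistics at one distance/angle (illustrative)
    glcm = graycomatrix(mouth_img, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    glcm_feats = [graycoprops(glcm, prop)[0, 0]
                  for prop in ("contrast", "correlation", "energy", "homogeneity")]

    return np.concatenate([lbp_hist, glcm_feats])

def train_emotion_svm(mouth_images, labels, test_size=0.2):
    """Train a multiclass SVM on combined LBP+GLCM features and report test accuracy."""
    X = np.array([extract_features(img) for img in mouth_images])
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=test_size, stratify=labels, random_state=0)
    clf = SVC(kernel="rbf", decision_function_shape="ovr")  # one-vs-rest multiclass SVM
    clf.fit(X_train, y_train)
    return clf, clf.score(X_test, y_test)
```

Varying `test_size` between 0.1 and 0.4 would correspond to the 10% to 40% test splits mentioned in the abstract; the actual segmentation step and parameter choices would depend on the dataset used.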
