Abstract
Facial expression recognition, the ability to decipher human emotions from facial features, makes it possible to judge a person's emotional state by analysing their facial expressions. Empowered by this technology, student sentiment analysis has become a focal point in educational technology research. Teachers can now read the emotional state of their students and estimate the effectiveness of their teaching strategies, enabling them to apply appropriate interventions to improve teaching outcomes. However, current facial expression recognition techniques are limited by their shortcomings: network performance degradation and loss of feature information are key issues that hinder the effectiveness of sentiment analysis of student feedback. To overcome these limitations, this paper proposes RFMNet, a novel network model based on deep learning theory. RFMNet, with its stronger feature extraction capability, is designed to analyse student expressions accurately. It introduces a relation‐aware global attention (RGA) module, which integrates more discriminative expression features from the image and leads to more refined sentiment analysis. The model also employs a Mish activation function and a focal loss function. The Mish activation function is of particular interest because it avoids the loss of feature information caused by neuron deactivation when the ReLU gradient is zero. Together, these components yield a more robust and accurate facial expression recognition model. Evaluated on the public FERPlus dataset, the model achieves an average recognition accuracy of 89.62%, demonstrating its ability to decode students' facial expressions. This precision enables intelligent processing of teaching information and real‐time feedback, allowing teachers to adapt their teaching strategies to better suit the needs of their students.
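For illustration, the sketch below shows minimal PyTorch versions of the two components named above, the Mish activation and the focal loss; it is not the authors' RFMNet implementation, and the focusing parameter gamma = 2.0 and the 8-class setup (the FERPlus label set) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Mish(nn.Module):
    """Mish activation: x * tanh(softplus(x)).
    Unlike ReLU, its gradient is non-zero for negative inputs,
    so neurons are less likely to stop passing feature information."""
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))


class FocalLoss(nn.Module):
    """Multi-class focal loss: down-weights easy examples so training
    focuses on hard, easily confused expression classes.
    gamma = 0 recovers standard cross-entropy."""
    def __init__(self, gamma: float = 2.0):
        super().__init__()
        self.gamma = gamma

    def forward(self, logits, targets):
        ce = F.cross_entropy(logits, targets, reduction="none")  # -log(p_t)
        p_t = torch.exp(-ce)                                     # probability of the true class
        return ((1.0 - p_t) ** self.gamma * ce).mean()


# Toy usage with 8 expression classes (as in FERPlus)
logits = torch.randn(4, 8)            # batch of 4 predictions
targets = torch.randint(0, 8, (4,))   # ground-truth expression labels
loss = FocalLoss(gamma=2.0)(logits, targets)
print(Mish()(torch.tensor([-1.0, 0.0, 1.0])), loss.item())
```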