Abstract

Ensuring student engagement is crucial for effective learning outcomes in any classroom setting, including e-learning environments. However, the absence of immediate supervision in online classes makes it difficult to monitor and maintain student attentiveness. To address this challenge, this study proposes a cognitive state detection system that continuously monitors a learner's facial emotion in an adaptive learning environment. The proposed algorithm detects cognitive states such as attentiveness and inattentiveness. The system has been implemented on four separate databases and evaluated using three ensemble models: FT-EDFA, FC-EDFA, and OT-EDFA. The ensemble models were created by applying transfer learning to two popular pre-trained models, VGG19 and ResNet50, which learn useful features from facial images for emotion recognition tasks. By combining the features learned by both models, the ensemble approach achieves better performance in recognising facial emotions. The proposed system can provide continuous feedback to instructors, enabling them to adjust their teaching methods to maintain student engagement and interest. The study achieved promising results, surpassing existing methods with recognition rates of 93.11%, 92.34%, and 91.12% on the newly created dataset. By detecting cognitive states in online learners, the proposed system helps instructors gauge how engaged and interested their students are during classes. Overall, facial emotion recognition can be useful for improving the quality of e-learning platforms and enhancing student learning outcomes.
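The feature-level ensemble described above (transfer learning over frozen VGG19 and ResNet50 backbones, with their features combined for classification) could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the two-class output (attentive/inattentive), input size, and pooling/concatenation head are assumptions, and `weights=None` is used here only to keep the sketch light where real transfer learning would load `weights="imagenet"`.

```python
# Hedged sketch of a VGG19 + ResNet50 feature ensemble for cognitive-state
# classification, assuming a simple concatenate-and-classify head.
from tensorflow.keras import Input, Model
from tensorflow.keras.applications import VGG19, ResNet50
from tensorflow.keras.layers import Concatenate, Dense, GlobalAveragePooling2D


def build_ensemble(num_classes=2, input_shape=(224, 224, 3)):
    inp = Input(shape=input_shape)
    # Two pre-trained backbones share one input and act as feature extractors.
    # In real use: weights="imagenet" (weights=None here avoids the download).
    vgg = VGG19(weights=None, include_top=False, input_tensor=inp)
    res = ResNet50(weights=None, include_top=False, input_tensor=inp)
    for layer in vgg.layers + res.layers:
        layer.trainable = False  # transfer learning: freeze backbone weights
    # Pool each backbone's feature maps, then fuse them at the feature level.
    feats = Concatenate()([
        GlobalAveragePooling2D()(vgg.output),   # 512-d VGG19 features
        GlobalAveragePooling2D()(res.output),   # 2048-d ResNet50 features
    ])
    out = Dense(num_classes, activation="softmax")(feats)
    return Model(inp, out)
```

Only the small classification head on top of the concatenated 2560-dimensional feature vector would then be trained on the facial-emotion data.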
