Abstract

In education technology, empathic interaction with the user and feedback from the learning system based on multiple inputs such as video, voice, and text is an important area of research. In this paper, a non-intrusive, standalone model for intelligent assessment of alertness and emotional state, together with generation of appropriate feedback, is proposed. Using non-intrusive visual cues, the system classifies the user's emotional and alertness states from facial expressions, ocular parameters, postures, and gestures, and provides feedback appropriate to the detected cognitive state. Specifically, alertness level is assessed from ocular parameters such as PERCLOS and saccadic measures, emotional state is inferred from facial expression analysis, and relevant cognitive and emotional states are detected from upper-body gestures and postures. Integration of such a system into an e-learning environment is expected to enhance students' performance through interaction, feedback, and positive mood induction.
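
The abstract names PERCLOS (percentage of eyelid closure over time) as one of the ocular parameters used to gauge alertness. As an illustration only, the sketch below shows one common way such a measure can be computed from per-frame eye-aspect-ratio (EAR) values over a sliding window; the thresholds, window size, and helper names are assumptions for this sketch and are not taken from the paper.

    # Minimal sketch (not the authors' implementation): estimating a drowsiness
    # flag from per-frame eye-aspect-ratio (EAR) values via PERCLOS.
    # All thresholds below are illustrative assumptions, not values from the paper.

    from collections import deque

    EAR_CLOSED_THRESHOLD = 0.2   # assumed: EAR below this counts as "eye closed"
    PERCLOS_ALERT_LIMIT = 0.15   # assumed: closed-frame fraction indicating drowsiness
    WINDOW_FRAMES = 900          # assumed: ~30 s window at 30 fps

    class PerclosMonitor:
        """Tracks the fraction of recent frames in which the eyes were closed."""

        def __init__(self, window=WINDOW_FRAMES):
            self.frames = deque(maxlen=window)

        def update(self, ear: float) -> float:
            """Record one frame's EAR value and return the current PERCLOS."""
            self.frames.append(ear < EAR_CLOSED_THRESHOLD)
            return sum(self.frames) / len(self.frames)

        def is_drowsy(self) -> bool:
            """True if the closed-eye fraction exceeds the assumed alert limit."""
            return bool(self.frames) and (sum(self.frames) / len(self.frames)) > PERCLOS_ALERT_LIMIT

    # Hypothetical usage with a per-frame EAR stream from the video pipeline:
    # monitor = PerclosMonitor()
    # for ear in ear_stream():          # ear_stream() is a placeholder
    #     perclos = monitor.update(ear)
    #     if monitor.is_drowsy():
    #         trigger_feedback()        # placeholder for the system's feedback action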
