Abstract

We report current findings on the use of video recordings of facial expressions and body movements to provide personalized affective support in an educational context, following an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology for tagging facial expressions and body movements that correspond to changes in learners' affective states while they deal with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral data. Together, these data are used to train data mining algorithms that automatically identify changes in learners' affective states when dealing with cognitive tasks, which in turn helps to provide personalized emotional support.

Highlights

  • Adaptive systems can be used to intelligently manage the affective dimension of the learner in order to foster the interplay that exists between the cognitive aspects of learning and affect [1].

  • The purpose of this work was not to obtain conclusive results but to identify the main challenges and difficulties involved in emotion detection from facial expressions and body movements in learning settings, aiming to enrich and support a multimodal framework for emotion detection in educational scenarios.

  • This paper describes the related background and the proposed approach for the annotation process, including the reported methodology for detecting facial expressions, body movements, and associated emotional information while the learner interacts with a learning environment involving cognitive tasks.


Introduction

Adaptive systems can be used to intelligently manage the affective dimension of the learner in order to foster the interplay that exists between the cognitive aspects of learning and affect [1]. We focus on reporting the methodology, derived with the involvement of a psychoeducational expert, for annotating changes in participants' affective states with meaningful predefined tags while viewing video recordings of their performance on cognitive tasks in a learning context. This methodology draws on data gathered from the learner's global interaction, including task performance and self-reported emotional reports. The ultimate goal is to use these annotations to train a data-mining-based system that can automatically identify changes in the user's affective state and, from them, provide the required affective support.
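As an illustration of the kind of data such an annotation process could produce, the sketch below defines a minimal annotation record for a video segment and cross-checks expert tags against the learner's self-reports. All field names, tag labels, and the matching rule are hypothetical assumptions for illustration; they are not taken from the study.

```python
from dataclasses import dataclass

# Hypothetical annotation record: segment of a session video tagged by an
# expert, paired with the learner's self-reported emotion for that segment.
@dataclass
class AffectAnnotation:
    video_id: str      # identifier of the recorded learning session
    start_s: float     # segment start time (seconds)
    end_s: float       # segment end time (seconds)
    modality: str      # e.g. "facial" or "body"
    tag: str           # predefined affective-state tag, e.g. "frustration"
    self_report: str   # learner's self-reported emotion for the segment

def tags_agreeing_with_self_report(annotations):
    """Keep annotations whose expert tag matches the learner's self-report,
    a simple way to cross-check video annotations against self-reported data."""
    return [a for a in annotations if a.tag == a.self_report]

annotations = [
    AffectAnnotation("s01", 12.0, 18.5, "facial", "frustration", "frustration"),
    AffectAnnotation("s01", 40.0, 47.0, "body", "boredom", "engagement"),
]
matched = tags_agreeing_with_self_report(annotations)
```

Records of this kind, once agreement is established, could serve as labeled examples for the data-mining step described above.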
