Abstract

Emotions play a critical role during learning and problem solving with advanced learning technologies (ALTs). Despite their importance, relatively few attempts have been made to understand learners' emotional monitoring and regulation by using data visualizations of their own (and others') cognitive, affective, metacognitive, and motivational (CAMM) self-regulated learning (SRL) processes to potentially foster their emotion regulation (ER). We present a theoretically based and empirically driven conceptual framework that addresses ER by proposing the use of visualizations of one's own and others' CAMM SRL multichannel data to facilitate learners' monitoring and regulation of emotions during learning with ALTs. We use an example with eye-tracking data to illustrate the mapping between theoretical assumptions, ER strategies, and the types of data visualizations that can enhance learners' ER, including key processes such as emotion flexibility, emotion adaptivity, and emotion efficacy. We conclude with future directions leading to a systematic interdisciplinary research agenda that addresses outstanding ER-related issues by integrating models, theories, methods, and analytical techniques for the cognitive, learning, and affective sciences; human-computer interaction (HCI); data visualization; big data; data mining; and SRL.
