Abstract

Socially shared regulation plays a pivotal role in the success of collaborative learning. However, evaluating socially shared regulation of learning (SSRL) is challenging because the cognitive and socio-emotional interactions at its core are dynamic and infrequent. To address this challenge, this paper brings together interdisciplinary researchers to establish a multimodal dataset of cognitive and socio-emotional interactions for SSRL research. First, to elicit cognitive and socio-emotional interactions, learning science researchers designed a dedicated collaborative learning task with regulatory trigger events for triadic groups. Second, the dataset includes multiple modalities, namely video, Kinect data, audio, and physiological signals (accelerometer, EDA, heart rate), collected from 81 high school students in 28 groups, offering a comprehensive view of the SSRL process. Third, it provides three-level annotations of verbal interactions as well as annotations of non-verbal interactions, including facial expression, eye gaze, gesture, and posture, which can benefit interdisciplinary fields such as computer science, sociology, and education. In addition, comprehensive analyses verify the dataset's effectiveness. To the best of our knowledge, this is the first multimodal dataset for studying SSRL among triadic group members.
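For readers who want to work with recordings of this kind, the sketch below illustrates one possible way to index a triadic session's multimodal streams and annotation files in Python. It is not the authors' release format: every directory layout, file name, and field (e.g. `group_*`, `eda.csv`, `verbal_annotations.csv`) is a hypothetical assumption, used only to make the structure of the modalities described in the abstract concrete.

```python
# Hypothetical sketch of indexing one group's multimodal recordings.
# All paths, file names, and field names are assumptions for illustration,
# not the dataset's actual release format.
from dataclasses import dataclass, field
from pathlib import Path
from typing import Dict, List


@dataclass
class ParticipantStreams:
    """Per-participant recordings within one triadic group session."""
    video: Path                  # frontal video of the participant
    audio: Path                  # individual audio track
    kinect: Path                 # Kinect skeleton/depth capture
    physiology: Dict[str, Path]  # e.g. {"eda": ..., "heart_rate": ..., "accelerometer": ...}


@dataclass
class GroupSession:
    """One triadic collaborative-learning session with its annotations."""
    group_id: str
    participants: Dict[str, ParticipantStreams]  # keyed by anonymized student ID
    verbal_annotations: Path                     # three-level verbal interaction labels
    nonverbal_annotations: Dict[str, Path] = field(default_factory=dict)
    # e.g. {"facial_expression": ..., "eye_gaze": ..., "gesture": ..., "posture": ...}


def load_sessions(root: Path) -> List[GroupSession]:
    """Walk an assumed directory layout (root/group_XX/participant_YY/...) and index sessions."""
    sessions: List[GroupSession] = []
    for group_dir in sorted(root.glob("group_*")):
        participants = {}
        for p_dir in sorted(group_dir.glob("participant_*")):
            participants[p_dir.name] = ParticipantStreams(
                video=p_dir / "video.mp4",
                audio=p_dir / "audio.wav",
                kinect=p_dir / "kinect.csv",
                physiology={
                    "eda": p_dir / "eda.csv",
                    "heart_rate": p_dir / "heart_rate.csv",
                    "accelerometer": p_dir / "accelerometer.csv",
                },
            )
        sessions.append(
            GroupSession(
                group_id=group_dir.name,
                participants=participants,
                verbal_annotations=group_dir / "verbal_annotations.csv",
                nonverbal_annotations={
                    "facial_expression": group_dir / "facial_expression.csv",
                    "eye_gaze": group_dir / "eye_gaze.csv",
                    "gesture": group_dir / "gesture.csv",
                    "posture": group_dir / "posture.csv",
                },
            )
        )
    return sessions
```

Under these assumptions, `load_sessions(Path("/path/to/dataset"))` would return one `GroupSession` per group, keeping each participant's streams and the group-level annotation files addressable for downstream analysis.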
