Abstract

Interactive learning environments with body-centric technologies lie at the intersection of the design of embodied learning activities and multimodal learning analytics. Sensing technologies can automatically capture large amounts of fine-grained data from student movements. Researchers can use these fine-grained data to build a high-resolution picture of the activity that takes place during these student–computer interactions and explore whether the sequence of movements has an effect on learning. We present a use case of modelling temporal data in an interactive learning environment with hand gestures, and discuss some validity threats that arise when temporal dependencies are not accounted for. In particular, we assess how ignoring the temporal dependencies in the measurement of hand gestures affects the goodness of fit of the statistical model and the measurement of the similarity between elicited and enacted movements. Our findings show that accounting for temporality is crucial for finding a meaningful fit to the data. Using temporal analytics, we are able to create a high-resolution picture of how sensorimotor coordination correlates with learning gains in our learning system.
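
To illustrate why temporal structure matters when comparing elicited and enacted movements, the sketch below contrasts an order-agnostic comparison of two gesture trajectories with a dynamic time warping (DTW) alignment that respects temporal ordering. This is a minimal, hypothetical example: the paper's actual similarity measure and data are not specified here, the trajectories are simulated, and DTW is used only as one common way of accounting for temporal dependencies.

```python
import numpy as np


def dtw_distance(elicited: np.ndarray, enacted: np.ndarray) -> float:
    """Dynamic time warping distance between two 1-D gesture trajectories.

    Illustrative only: DTW stands in for any similarity measure that
    respects the temporal order of the movement data.
    """
    n, m = len(elicited), len(enacted)
    # cost[i, j] = minimal cumulative cost of aligning elicited[:i] with enacted[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(elicited[i - 1] - enacted[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a sample of the enacted gesture
                                 cost[i, j - 1],      # skip a sample of the elicited gesture
                                 cost[i - 1, j - 1])  # align the two samples
    return float(cost[n, m])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2 * np.pi, 100)
    elicited = np.sin(t)                                       # movement shown to the learner
    enacted = np.sin(t - 0.4) + rng.normal(0, 0.05, t.size)    # delayed, noisy enactment

    # An order-agnostic comparison (here, mean absolute difference of the
    # sorted values) ignores the temporal shift between the two gestures.
    naive = np.mean(np.abs(np.sort(elicited) - np.sort(enacted)))
    temporal = dtw_distance(elicited, enacted)
    print(f"order-agnostic difference: {naive:.3f}")
    print(f"DTW (temporal) distance:   {temporal:.3f}")
```

Under these simulated conditions, the order-agnostic score barely registers the phase lag between the elicited and enacted gesture, whereas the temporally aware distance reflects it, which is the kind of measurement discrepancy the abstract refers to.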
