Abstract

Understanding existing gender differences in the development of computational thinking skills is increasingly important for gaining insights into bridging the gender gap. However, few studies to date have examined gender differences based on the learning process in a realistic classroom context. In this work, we investigate gender classification using students' eye movements, which reflect temporal human behavior, during a computational thinking lesson in an immersive VR classroom. We trained several machine learning classifiers and showed that students' eye movements provide discriminative information for gender classification. In addition, we employed a Shapley additive explanations (SHAP) approach for feature selection and further model interpretation. The classification model trained with the best eye movement feature set selected using SHAP achieved improved performance, with an average accuracy of over 70%. The SHAP values further explained the classification model by identifying important features and their impact on the model output, namely gender. Our findings provide insights into the use of eye movements for in-depth, ecologically valid investigations of gender differences in learning activities in VR classrooms, and may inform personalized learning support and tutoring in such educational systems as well as the optimization of system design.
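
As an illustration of the SHAP-based feature selection described above, the following Python sketch (not the authors' code; the data, model choice, and number of retained features are assumptions) trains a classifier on eye-movement features, ranks features by mean absolute SHAP value, and retrains on the top-ranked subset.

    # Illustrative sketch: gender classification from eye-movement features
    # with SHAP-based feature selection. Random data stands in for the real
    # per-student features (e.g., fixation duration or pupil diameter statistics).
    import numpy as np
    import shap
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))          # 200 students x 20 eye-movement features (placeholder)
    y = rng.integers(0, 2, size=200)        # binary gender label (placeholder)

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    # 1) Fit a baseline classifier on all features.
    model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

    # 2) Explain it with SHAP and rank features by mean absolute SHAP value.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X_tr)     # shape: (n_samples, n_features)
    importance = np.abs(shap_values).mean(axis=0)
    top_k = np.argsort(importance)[::-1][:8]      # keep the 8 most influential features (assumed k)

    # 3) Retrain on the selected feature subset and evaluate.
    model_sel = GradientBoostingClassifier(random_state=0).fit(X_tr[:, top_k], y_tr)
    print("accuracy with selected features:",
          accuracy_score(y_te, model_sel.predict(X_te[:, top_k])))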
