When in front of a classroom, a skilled teacher can read the room, identifying when students are engaged, frustrated, or distracted. In recent years we have seen significant changes in the traditional classroom, with virtual classes becoming a normal learning environment. Reasons for this change include the increased popularity of Massive Open Online Courses (MOOCs) and the disruptions imposed by the ongoing COVID-19 pandemic. However, it is difficult for teachers to read the room in these virtual classrooms, and researchers have begun to look at using sensors to provide feedback that can inform teaching practices. The study presented here sought to ground classroom sensor data, in the form of electrodermal activity (EDA) captured with a wrist-worn sensing platform (Empatica E4), in observations of students' emotional engagement in class. We collected a dataset from eleven students over eight lectures in college-level computer science classes. We trained human annotators who provided ground-truth information about student engagement based on in-class observations. Inspired by related work in the field, we implemented an automated data analysis framework, which we used to explore momentary assessments of student engagement in classrooms. To our surprise, we found no significant correlation between the sensor data and our trained observers' annotations. In this paper, we present our study and framework for automated engagement assessment, and report on findings that highlight some of the challenges in deploying current technology for real-world, automated momentary assessment of student engagement in the classroom. We offer reflections on our findings and discuss ways forward toward an automated "reading the room" approach.