Abstract
Technological advances and the surge of epidemics have driven rapid growth in the number of online learning platforms. Single-modal data is often insufficient for evaluating the usability of interface interaction design on such platforms, while multimodal data (eye movement, electroencephalogram (EEG), and skin conductance response (SCR)) collected with advanced sensing technologies offers new possibilities for addressing this issue. This case study therefore explores how multimodal data can be used to evaluate the interface design of our self-developed collaborative reading system. The results of our randomized between-subjects experiment showed that, in the eye-movement analysis, constructive-level annotations prompted students to allocate more attention to the annotation area than active-level annotations did and facilitated transitions between the annotation and reading areas. In the EEG analysis, all students maintained high concentration levels regardless of the type of annotation they were reading. In the SCR analysis, although no significant difference in the level of excitement was identified between experimental conditions, students showed great individual differences within the same condition. This study illustrates how multimodal data can be applied to interface design evaluation.