Abstract
Physiological and behavioral measures allow computing devices to augment the user interaction experience by understanding users' mental load. Current techniques often exploit complementary information across different modalities to index load level, typically within a single task. In this study, we propose a new approach that uses the timing between physiology/behavior change events to index low and high load levels across four task types. Findings from a user study in which eye, speech, and head movement data were collected from 24 participants demonstrate that the proposed measures differ significantly between low and high load levels, with large effect sizes. We also found that voluntary actions are more likely to be coordinated during tasks. Implications for the design of multimodal-multisensor interfaces include (i) using change events and their interaction across multiple modalities is a feasible way to distinguish task load levels and load types, and (ii) voluntary actions should be permitted for effective task completion.