Abstract

Physiological and behavioral measures allow computing devices to augment the user interaction experience by understanding users' mental load. Current techniques often exploit complementary information between different modalities to index load level, typically within a single task. In this study, we propose a new approach that uses the timing between physiology/behavior change events to index low and high load levels across four task types. Findings from a user study in which eye, speech, and head movement data were collected from 24 participants demonstrate that the proposed measures differ significantly between low and high load levels, with large effect sizes. We also found that voluntary actions are more likely to be coordinated during tasks. Implications for the design of multimodal-multisensor interfaces include (i) using event changes and interactions across multiple modalities is feasible for distinguishing task load levels and load types, and (ii) voluntary actions should be allowed for effective task completion.
