Abstract

For longitudinal behavior analysis, task type is an unavoidable and important variable. In this article, we propose an event-based behavior modeling approach and employ non-invasive wearable sensing modalities (eye activity, speech, and head movement) to recognize task load level under four different task load types. The novelty lies in converting physiological and behavioral signals into meaningful events and exploiting their sequence across multiple modalities to distinguish load levels and types. We evaluated this approach on head-worn sensor data from 24 participants who completed four different tasks, recognizing (i) low versus high load level for a given task load type, (ii) low versus high load level regardless of load type, and (iii) both load level and load type jointly. Findings show that the recognition rate is reasonable in (i), close to chance level in (ii), and well above chance level in (iii) for eight classes, under both participant-dependent and participant-independent schemes. Further, a fusion of the proposed event-based features and conventional continuous features achieved the best or comparable performance in most cases. These results suggest that task type needs to be considered when using continuous features, and that the proposed event-based modeling paradigm is promising for longitudinal behavior analysis.
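To make the core idea concrete, the following is a minimal illustrative sketch (not the authors' actual pipeline) of converting continuous signals into discrete events and merging per-modality event streams into one time-ordered sequence. The thresholds, event names, and signals are hypothetical.

```python
# Illustrative sketch only: simple threshold-crossing event detection and
# cross-modal event-sequence fusion. All values and labels are hypothetical.

def signal_to_events(samples, threshold, label):
    """Emit one (timestamp, label) event each time the signal rises above threshold."""
    events = []
    above = False
    for t, x in enumerate(samples):
        if x > threshold and not above:
            events.append((t, label))
            above = True
        elif x <= threshold:
            above = False
    return events

def fuse_event_streams(*streams):
    """Merge per-modality event streams into one time-ordered label sequence."""
    merged = sorted((e for s in streams for e in s), key=lambda e: e[0])
    return [label for _, label in merged]

# Toy signals: blink-like peaks in an eye channel, energy dips in a speech channel.
eye = [0.1, 0.9, 0.2, 0.1, 0.8, 0.1]       # peaks at t=1 and t=4
speech = [0.5, 0.5, 0.05, 0.5, 0.5, 0.05]  # dips at t=2 and t=5
pause = [1 - x for x in speech]            # invert so pauses become peaks

seq = fuse_event_streams(
    signal_to_events(eye, 0.7, "blink"),
    signal_to_events(pause, 0.7, "pause"),
)
print(seq)  # ['blink', 'pause', 'blink', 'pause']
```

The resulting multimodal event sequence (rather than raw continuous values) is the kind of representation from which sequence-based features could then be derived.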
