Abstract

Usability of in-vehicle systems has become increasingly important for ease of operation and driving safety. The user interface (UI) of in-vehicle systems is a critical focus of usability research. This paper studies how to use advanced computational, physiology-based, and behavior-based tools and methodologies to determine the affective/emotional states and behavior of an individual in real time, and in turn how to adapt the human-vehicle interaction to meet the user's cognitive needs based on this real-time assessment. Specifically, we set up a set of physiological sensors capable of collecting EEG, facial EMG, skin conductance response, and respiration data, together with motion-sensing and tracking equipment capable of capturing eye movement and the objects with which the user is interacting. All hardware components and software are integrated into an augmented sensor platform that performs as "one coherent system" to enable multimodal data processing and information inference for context-aware analysis of emotional states and cognitive behavior, based on a rough set inference engine. Meanwhile, subjective data are also recorded for comparison. A usability study of an in-vehicle system UI is presented to demonstrate the potential of the proposed methodology.
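The abstract names a rough set inference engine as the core of the context-aware analysis. As a minimal sketch of the rough-set idea only (the paper's actual feature set, discretization, and rules are not given here; the attribute names and table below are hypothetical), discretized multimodal readings can be grouped into indiscernibility classes, and the lower approximation of an emotional-state label then identifies the observations that can be classified with certainty:

```python
from collections import defaultdict

# Hypothetical decision table: discretized multimodal features mapped to an
# emotional-state label. Attribute names and values are illustrative only.
table = [
    {"eeg_alpha": "high", "scr": "low",  "resp": "slow", "state": "calm"},
    {"eeg_alpha": "high", "scr": "low",  "resp": "slow", "state": "calm"},
    {"eeg_alpha": "low",  "scr": "high", "resp": "fast", "state": "stressed"},
    {"eeg_alpha": "low",  "scr": "low",  "resp": "fast", "state": "stressed"},
    {"eeg_alpha": "low",  "scr": "low",  "resp": "fast", "state": "calm"},
]

def indiscernibility_classes(rows, attrs):
    """Group row indices that are identical on the chosen condition attributes."""
    classes = defaultdict(list)
    for i, row in enumerate(rows):
        classes[tuple(row[a] for a in attrs)].append(i)
    return list(classes.values())

def lower_approximation(rows, attrs, decision, value):
    """Indices whose whole indiscernibility class carries the target decision,
    i.e. objects that can be assigned that label with certainty."""
    approx = set()
    for cls in indiscernibility_classes(rows, attrs):
        if all(rows[i][decision] == value for i in cls):
            approx.update(cls)
    return approx

attrs = ["eeg_alpha", "scr", "resp"]
print(sorted(lower_approximation(table, attrs, "state", "calm")))      # [0, 1]
print(sorted(lower_approximation(table, attrs, "state", "stressed")))  # [2]
```

Rows 3 and 4 share identical condition attributes but different labels, so neither lands in a lower approximation; in rough set terms they form the boundary region, which is exactly the uncertainty such an inference engine must handle.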
