Abstract

Vehicles with more usable interfaces have become increasingly popular because they are easier and safer to operate. However, the methods used to study the usability of in-vehicle system user interfaces still need improvement. This paper examines how advanced computational, neurophysiological, and psychological tools and methodologies can be used to determine the affective (emotional) states and behavioral data of an individual in real time, and in turn how human-vehicle interaction can be adapted to meet the user’s cognitive needs based on this real-time assessment. Specifically, we set up a suite of neurophysiological equipment that collects EEG, facial EMG (electromyography), skin conductance response, and respiration data, together with motion-sensing and tracking equipment that captures eye movements and the objects the user interacts with. All hardware components and software are integrated into a cohesive augmented sensor platform that performs as “one coherent system,” enabling multi-modal data processing and information inference for context-aware analysis of affective and cognitive states based on a rough set inference engine. Subjective data are also recorded for comparison. A usability study of an in-vehicle system UI demonstrates the potential of the proposed methodology.
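The abstract's rough set inference engine rests on the classical notion of lower and upper approximations of a decision class under the indiscernibility relation induced by the condition attributes. The sketch below illustrates the idea on hypothetical, already-discretized multi-modal readings (the feature names, labels, and data are invented for illustration and are not from the paper); it is a minimal conceptual example, not the authors' implementation.

```python
from collections import defaultdict

# Hypothetical discretized multi-modal observations:
# (EEG band, facial EMG level, skin conductance level) -> affective label.
records = [
    (("alpha_low", "emg_high", "scr_high"), "stressed"),
    (("alpha_low", "emg_high", "scr_high"), "stressed"),
    (("alpha_high", "emg_low", "scr_low"), "relaxed"),
    (("alpha_low", "emg_low", "scr_high"), "stressed"),
    (("alpha_low", "emg_low", "scr_high"), "relaxed"),  # conflicts with the previous record
]

def rough_approximations(records, target_label):
    """Lower/upper approximation of a decision class under the
    indiscernibility relation induced by the condition attributes."""
    # Equivalence classes: records with identical condition attributes
    # are indiscernible from one another.
    eq_classes = defaultdict(set)
    for idx, (features, _) in enumerate(records):
        eq_classes[features].add(idx)
    target = {i for i, (_, label) in enumerate(records) if label == target_label}
    lower, upper = set(), set()
    for eq in eq_classes.values():
        if eq <= target:
            lower |= eq   # certainly members of the class
        if eq & target:
            upper |= eq   # possibly members of the class
    return lower, upper

lower, upper = rough_approximations(records, "stressed")
```

Records in the lower approximation can be classified as "stressed" with certainty, while the boundary region (upper minus lower) contains the sensor patterns whose label is ambiguous; an inference engine built on this idea can report certain versus possible affective states from the fused sensor data.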
