Abstract

This paper presents a study on multi-modal human emotional state detection while riding a powered wheelchair (PMV; Personal Mobility Vehicle) in an indoor labyrinth-like environment. The study reports findings on the habituation of the human stress response during self-driving. In addition, the effects of “loss of controllability”, the change in the driver's role to that of a passenger, are investigated via an autonomous driving modality. The sensing framework of the multi-modal emotional state detector consists of four devices: electroencephalograph (EEG), heart inter-beat interval (IBI), galvanic skin response (GSR), and a stressor level lever (in the case of autonomous riding). Physiological emotional state measurement characteristics are organized by time scale, capturing slower (long-term) changes and quicker, moment-to-moment changes. Experimental results with fifteen participants, including subjective emotional state reports and commercial software measurements, validated the proposed emotional state detector. Short-term GSR and heart signal characterizations captured moment-to-moment emotional state during autonomous riding (Spearman correlation: ρ = 0.6, p < 0.001). Short-term GSR and EEG characterizations reliably captured moment-to-moment emotional state during self-driving (classification accuracy: 69.7%). Finally, long-term GSR and heart characterizations were confirmed to reliably capture slow changes in emotional state during autonomous riding and during the participants' resting state. The purpose of this study, and of its exploration of various algorithms and sensors within a structured framework, is to provide a comprehensive background for multi-modal emotional state prediction experiments and/or applications. Additional discussion is given regarding the feasibility and utility of these concepts.
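The Spearman analysis reported above can be illustrated with a minimal sketch. The per-segment feature values and subjective ratings below are placeholders rather than the study's data, and `scipy.stats.spearmanr` stands in for whatever statistical package the authors actually used.

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder per-segment values standing in for a short-term GSR
# characterization and the corresponding subjective emotional-state ratings.
gsr_feature = np.array([0.21, 0.35, 0.48, 0.30, 0.62, 0.55, 0.70, 0.41])
subjective  = np.array([1,    2,    3,    2,    4,    4,    5,    3   ])

# Rank correlation between the physiological feature and the reported state
rho, p_value = spearmanr(gsr_feature, subjective)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```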

Highlights

  • There is a need for simple, robust, and streamlined technologies to allow individuals to retain personal control of their living environment without expensive healthcare servicing and/or time-consuming family support

  • This paper presents a study on multi-modal human emotional state detection while riding a powered wheelchair (PMV; Personal Mobility Vehicle) in an indoor labyrinth-like environment

  • Because the ultra-low-frequency (ULF) and very-low-frequency (VLF) bands require long-term data for accurate characterization, we focus on the low-frequency to high-frequency ratio (LF/HF), which represents sympathetic and parasympathetic equilibrium (see the sketch after this list)
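A minimal sketch of how an LF/HF ratio can be derived from an IBI series is given below. The resampling rate, band limits, and Welch parameters are common defaults assumed for illustration; they are not taken from the paper's processing pipeline.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

def lf_hf_ratio(ibi_ms, fs_resample=4.0):
    """Estimate the LF/HF ratio from inter-beat intervals given in milliseconds."""
    ibi_s = np.asarray(ibi_ms, dtype=float) / 1000.0
    beat_times = np.cumsum(ibi_s)                       # time of each beat (s)
    # Resample the irregularly sampled IBI series onto a uniform time grid
    t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs_resample)
    ibi_uniform = np.interp(t_uniform, beat_times, ibi_s)
    # Power spectral density of the mean-removed series via Welch's method
    f, pxx = welch(ibi_uniform - ibi_uniform.mean(), fs=fs_resample, nperseg=256)
    lf_band = (f >= 0.04) & (f < 0.15)                  # low-frequency band
    hf_band = (f >= 0.15) & (f < 0.40)                  # high-frequency band
    lf_power = trapezoid(pxx[lf_band], f[lf_band])
    hf_power = trapezoid(pxx[hf_band], f[hf_band])
    return lf_power / hf_power
```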



Introduction

There is a need for simple (easy to use), robust (not easily damaged), and streamlined technologies to allow individuals to retain personal control of their living environment without expensive healthcare servicing and/or time-consuming family support. Wheelchair users face challenges with traveling over rough terrain, inaccessible paths, and other traffic-related inconveniences of city life [1]. In such cases, powered wheelchairs are available to assist with daily-life mobility. There are control options for powered wheelchairs besides the joystick, such as head or chin control, sip-and-puff, and others (eye gaze, tongue, hand, or foot control) [2]. These control options can attract unwanted attention in public spaces. Autonomous wheelchair control has increasingly become an attractive option, despite its high cost and the immaturity of the technology [3, 4].

