Abstract

This paper presents a new multimodal Human-Machine Interaction (HMI) model for the cooperation between a robotic nurse (here, a robotic wheelchair) and its human user. The HMI model processes vocal commands through a Personalized Isolated Word Recognition System (PIWRS) and recognizes Body Pose Angles (BPA) for real-time decision-making. In particular, the HMI scheme is able to: (i) recognize a set of voice commands, (ii) recognize a set of body postures and poses, and (iii) calculate the body angles associated with skeletal data obtained from a set of cameras. Furthermore, the HMI scheme receives readings from pressure sensors, which the user acts upon throughout the execution of the tasks; together these constitute the Active Participation System (APS). All of these variables are appropriately combined for the safe control of an Autonomous Intelligent Robotic Wheelchair (AIRW) used by people in need. More specifically, the stand-up, turn-around, and sit-down procedures are the steps under study.
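
The abstract does not specify the algorithms behind the BPA computation or the multimodal fusion; as a minimal sketch of the kind of processing described, the Python example below computes a joint angle from three 3D skeletal points (as a camera-based skeleton tracker would provide) and gates a hypothetical stand-up command on the agreement of a voice command, a pose angle, and a pressure-sensor reading. All function names, thresholds, and units here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by skeletal points a-b-c,
    e.g. hip-knee-ankle, taken from 3D camera skeleton data."""
    ba = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bc = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def allow_stand_up(voice_cmd, knee_angle_deg, armrest_pressure):
    """Hypothetical fusion rule: execute the stand-up procedure only
    when voice, pose, and pressure evidence all agree.
    The 160-degree and 5.0 thresholds are assumed, not from the paper."""
    return (voice_cmd == "stand up"
            and knee_angle_deg < 160.0      # knees still flexed (seated posture)
            and armrest_pressure > 5.0)     # user actively pushing on the armrests

# Example: hip, knee, and ankle joint positions (metres) from the skeleton stream
angle = joint_angle([0.0, 0.9, 0.0], [0.0, 0.5, 0.3], [0.0, 0.1, 0.1])
print(allow_stand_up("stand up", angle, armrest_pressure=7.2))  # True
```

In a real APS pipeline of this kind, such a rule would presumably run on every frame, so that the wheelchair halts the procedure as soon as any of the three modalities stops agreeing.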
