Abstract
This paper presents a novel multimodal interaction system that provides three modalities of human interaction for supervising and controlling a home automation system for people with limited physical mobility. The first is a brain-computer interface (BCI) based on surface electroencephalography (EEG) electrodes in a noninvasive NeuroSky wearable; in this case, the BCI is controlled by eye blinks. The second is a voice recognition system that uses spoken commands within a dialogue system. The third is a configurable touch screen on a mobile device. All three interaction modalities are interchangeable according to user needs. The multimodal interaction system controls home devices and appliances through a home gateway implemented on a resource-limited embedded system, which is responsible for applying the user commands detected by the multimodal interface to the corresponding home devices. The command set is configurable and extensible, so it can be adapted to the needs and abilities of different users. A prototype of the system was developed to verify the interaction modalities. The system operated adequately and comprehensibly for users with different profiles. Our initial tests show that multimodal control is valid for users with limited physical mobility.
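The abstract describes a modality-agnostic architecture: whichever interface detects a command (blink-driven BCI, voice, or touch), the home gateway maps it to the same configurable set of device actions. The paper does not specify this API, so the following Python sketch is purely illustrative; all class, method, and command names are assumptions.

```python
# Hypothetical sketch of the home-gateway dispatch described in the abstract:
# commands detected by any of the three modalities are resolved against a
# configurable, extensible command table. Names are illustrative only.
from typing import Callable, Dict


class HomeGateway:
    """Maps user commands from any interaction modality to device actions."""

    def __init__(self) -> None:
        # Extensible command set: new commands can be registered at runtime
        # to match a particular user's needs and abilities.
        self._actions: Dict[str, Callable[[], None]] = {}

    def register(self, command: str, action: Callable[[], None]) -> None:
        self._actions[command] = action

    def dispatch(self, command: str, modality: str) -> None:
        # The gateway is modality-agnostic: a blink-selected, spoken, or
        # touched command is handled identically once recognized.
        action = self._actions.get(command)
        if action is None:
            print(f"Unknown command '{command}' from {modality} interface")
            return
        print(f"[{modality}] executing '{command}'")
        action()


# Example usage with illustrative device commands.
gateway = HomeGateway()
gateway.register("lights_on", lambda: print("Living-room lights ON"))
gateway.register("tv_off", lambda: print("TV OFF"))

gateway.dispatch("lights_on", modality="BCI")  # selected via eye blink
gateway.dispatch("tv_off", modality="voice")   # spoken command
```

Keeping the command table as plain data is one way to realize the interchangeability the abstract claims: swapping the input modality changes nothing downstream of `dispatch`.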