Abstract
This paper presents a new multimodal control interface, based on a wearable wireless sensor network, for people living with upper-body disabilities. The proposed body-machine interface is modular and can be easily adapted to the residual functional capacities (RFCs) of different users. A custom data fusion algorithm has been developed for emulating joystick control using head motion measured with a lightweight wireless inertial sensor enclosed in a headset. The wearable network can include up to six modular sensor nodes, which can be used simultaneously to read different RFCs, including gestures and muscular activity, and translate them into commands. Sensor data fusion is performed inside the sensor nodes to reduce traffic on the wireless link, offload the base station, and decrease power consumption. Requirements for such an interface are established for powered-wheelchair users, and a proof-of-concept system is implemented and used to control an assistive robotic arm. It is shown that the performance of the system compares well with conventional control systems such as the joystick controller, while being potentially more suitable for severely disabled users.
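As a purely illustrative sketch (not the paper's actual data fusion algorithm), the following Python snippet shows one way fused head orientation (pitch and roll) could be translated into normalized two-axis joystick-like commands, using a dead zone to suppress small involuntary head motion and saturation at a maximum useful tilt. All function names, parameter names, and values are assumptions introduced for illustration.

```python
# Illustrative parameters (assumed, not from the paper): dead-zone and
# saturation angles, in degrees, for mapping head tilt to a virtual joystick axis.
DEAD_ZONE_DEG = 5.0
MAX_TILT_DEG = 25.0

def axis_from_tilt(angle_deg: float) -> float:
    """Map one head-tilt angle (pitch or roll) to a joystick axis in [-1, 1]."""
    sign = 1.0 if angle_deg >= 0 else -1.0
    mag = abs(angle_deg)
    if mag < DEAD_ZONE_DEG:          # ignore small involuntary head motion
        return 0.0
    mag = min(mag, MAX_TILT_DEG)     # saturate at the maximum useful tilt
    return sign * (mag - DEAD_ZONE_DEG) / (MAX_TILT_DEG - DEAD_ZONE_DEG)

def joystick_from_head(pitch_deg: float, roll_deg: float) -> tuple[float, float]:
    """Translate fused head orientation into (x, y) joystick-like commands."""
    return axis_from_tilt(roll_deg), axis_from_tilt(pitch_deg)

# Example: a 15-degree forward pitch with a slight 2-degree roll.
print(joystick_from_head(15.0, 2.0))  # -> (0.0, 0.5)
```

In a deployment such as the one described, this kind of mapping would run on the sensor node itself so that only the resulting commands, rather than raw inertial data, travel over the wireless link.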