Abstract

Human-machine interaction (HMI) technology holds great promise for rehabilitation medicine but remains limited by unsatisfactory recognition accuracy and wearing comfort. Here, this work develops a fully flexible, conformable, and functionalized multimodal HMI interface consisting of hydrogel-based sensors and a self-designed flexible printed circuit board. Owing to the compositional regulation and structural design of the hydrogel, both electromyogram (EMG) and forcemyography (FMG) signals can be collected accurately and stably, and are subsequently decoded with the assistance of artificial intelligence (AI). Compared with traditional multichannel EMG acquisition, the multimodal approach that combines EMG and FMG signals significantly improves interaction efficiency by increasing the information entropy of the interaction signals: the decoding accuracy for different gestures reaches 91.28% using only two channels. The resulting AI-powered active rehabilitation system can control a pneumatic robotic glove to assist stroke patients in completing movements according to the recognized motion intention. Moreover, the HMI interface generalizes to other remotely operated platforms, such as manipulators, intelligent cars, and drones, paving the way for the design of future intelligent robot systems.

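To illustrate the kind of multimodal decoding the abstract describes, the sketch below fuses time-domain features from one EMG channel and one FMG channel and classifies gestures with an off-the-shelf model. This is a minimal conceptual example, not the authors' pipeline: the synthetic data, the feature set, the window length, and the random-forest classifier are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's implementation): feature-level fusion of one EMG
# and one FMG channel, followed by gesture classification.
# Data shapes, features, and the classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def emg_features(window):
    """Common time-domain EMG descriptors: mean absolute value, RMS, waveform length."""
    return np.array([np.mean(np.abs(window)),
                     np.sqrt(np.mean(window ** 2)),
                     np.sum(np.abs(np.diff(window)))])

def fmg_features(window):
    """FMG (muscle-pressure) descriptors: mean level and spread of the force signal."""
    return np.array([np.mean(window), np.std(window)])

# Synthetic stand-in data: 600 windows of 200 samples per channel, 6 gesture classes.
n_windows, win_len, n_classes = 600, 200, 6
labels = rng.integers(0, n_classes, size=n_windows)
emg_windows = rng.normal(0.0, 1.0 + 0.2 * labels[:, None], size=(n_windows, win_len))
fmg_windows = rng.normal(0.5 * labels[:, None], 0.3, size=(n_windows, win_len))

# Multimodal fusion: concatenate per-channel features into one vector per window.
X = np.array([np.concatenate([emg_features(e), fmg_features(f)])
              for e, f in zip(emg_windows, fmg_windows)])

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("gesture accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Concatenating features from the two modalities before classification is one simple way to exploit the added information entropy of EMG plus FMG; the predicted gesture label would then drive an actuator such as the pneumatic robotic glove described above.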