The restoration of fine motor function in the hand is crucial for stroke survivors with hemiplegia to reintegrate into daily life, and it remains a significant challenge in post-stroke rehabilitation. Current mirror rehabilitation systems based on wearable devices require medical professionals or caregivers to help patients don a sensor glove on the healthy hand, hindering autonomous training, increasing labor costs, and imposing a psychological burden on patients. This study developed a low-load wearable hand-function mirror rehabilitation robotic system based on visual gesture recognition. The system incorporates an active vision apparatus that autonomously adjusts its position and viewpoint, enabling continuous monitoring of the healthy hand's gestures throughout the rehabilitation process. Consequently, patients need to wear the device only on the impaired hand to complete mirror training, facilitating independent rehabilitation exercises. A gesture recognition algorithm based on hand keypoints was developed that automatically identifies eight distinct gestures. Additionally, the system supports remote audio–video interaction during training sessions, addressing the lack of professional guidance in independent rehabilitation. A prototype of the system was constructed, a hand gesture recognition dataset was collected, and the system's performance and functionality were rigorously tested. The results indicate that gesture recognition accuracy exceeds 90% under ten-fold cross-validation. The system enables operators to complete hand rehabilitation training independently, while the active vision system accommodates a patient's rehabilitation needs across different postures. This study explores methods for autonomous hand-function rehabilitation training, offering valuable insights for future research on hand function recovery.
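The keypoint-based gesture recognition and ten-fold cross-validation evaluation described above can be sketched as follows. This is a minimal illustration, not the paper's actual method: the 21-point hand-keypoint layout, the nearest-centroid classifier, the four gesture labels, and the synthetic keypoint data are all assumptions introduced here for demonstration.

```python
import math
import random

random.seed(0)

NUM_KEYPOINTS = 21  # common hand-keypoint count (assumption; the paper does not specify)
GESTURES = ["fist", "open", "pinch", "point"]  # illustrative subset of the eight gestures

def normalize(points):
    """Translate so the wrist (keypoint 0) is at the origin, then scale to unit size,
    making the features invariant to hand position and distance from the camera."""
    x0, y0 = points[0]
    shifted = [(x - x0, y - y0) for x, y in points]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def features(points):
    """Flatten normalized keypoints into a feature vector."""
    return [c for p in normalize(points) for c in p]

def make_sample(gesture_id):
    """Synthetic keypoints: a gesture-specific template plus small noise
    (stands in for keypoints detected from camera frames)."""
    return [(math.cos(gesture_id + 0.3 * i) + random.gauss(0, 0.05),
             math.sin(0.7 * gesture_id + 0.3 * i) + random.gauss(0, 0.05))
            for i in range(NUM_KEYPOINTS)]

def train(samples):
    """Nearest-centroid classifier: mean feature vector per gesture label."""
    by_label = {}
    for f, label in samples:
        by_label.setdefault(label, []).append(f)
    return {label: [sum(col) / len(fs) for col in zip(*fs)]
            for label, fs in by_label.items()}

def predict(centroids, f):
    """Assign the label of the closest centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2 for a, b in zip(f, centroids[lab])))

# Build a small labeled dataset and run ten-fold cross-validation.
data = [(features(make_sample(g)), g)
        for g in range(len(GESTURES)) for _ in range(30)]
random.shuffle(data)
folds = [data[i::10] for i in range(10)]

accuracies = []
for k in range(10):
    test_fold = folds[k]
    train_set = [s for j, fold in enumerate(folds) if j != k for s in fold]
    centroids = train(train_set)
    correct = sum(predict(centroids, f) == label for f, label in test_fold)
    accuracies.append(correct / len(test_fold))

mean_acc = sum(accuracies) / len(accuracies)
print(f"mean 10-fold accuracy: {mean_acc:.2f}")
```

The normalization step mirrors a common design choice in vision-based hand recognition: classifying relative keypoint geometry rather than raw pixel coordinates, so the result does not depend on where the healthy hand appears in the camera frame.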