Abstract

We propose a hand gesture-operated system, an AI application intended to relieve discomfort and restore hand and arm function impaired by injuries and by nerve and muscle complications. The system trains patients through hand exercises such as performing hand gestures accurately, traversing within specified bounds, and operating a hand gesture calculator. These exercises depend on accurate hand gesture detection, which is impeded by background clutter and by variations in illumination and in the hand's size and angle. To address this, we developed a robust hand detection module built on a single-stage transformer deep network. The transformer encodes global image information and uses bipartite matching between predictions and ground-truth annotations to reduce the frequency of spurious detections. It drives a regression head and a classification head to localize the hand gesture in a bounding box and assign it a class label. Hand keypoints are also detected to support drawing, path traversal, and calculator use. The approach is evaluated on two benchmark datasets, OUHANDS and NUS, where it achieves 89.6% and 100% accuracy, respectively. These results indicate that precise hand detection can support a robust system for rehabilitation through hand exercises, and our experiments confirm that users' hand function progressively improved.
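The bipartite matching mentioned above can be illustrated with a minimal sketch. The idea, as in single-stage transformer detectors, is to assign each ground-truth gesture to exactly one prediction by minimizing a matching cost (here, an assumed cost combining box L1 distance and a class-mismatch penalty); unmatched predictions are treated as "no object," which is what suppresses spurious detections. The cost weights and brute-force assignment below are illustrative assumptions, not the paper's implementation.

```python
from itertools import permutations

def match_cost(pred, gt):
    # Assumed cost: L1 distance between box coordinates
    # plus a fixed penalty when the class labels disagree.
    box_cost = sum(abs(p - g) for p, g in zip(pred["box"], gt["box"]))
    class_cost = 0.0 if pred["label"] == gt["label"] else 1.0
    return box_cost + class_cost

def bipartite_match(preds, gts):
    """One-to-one assignment of ground truths to predictions that
    minimizes total cost (brute force; fine for small query counts).
    Predictions left unmatched would be scored as 'no object'."""
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(preds)), len(gts)):
        cost = sum(match_cost(preds[p], gts[g]) for g, p in enumerate(perm))
        if cost < best_cost:
            best, best_cost = list(perm), cost
    return best, best_cost

# Three predicted queries, one annotated hand gesture.
preds = [
    {"box": (0.9, 0.9, 0.2, 0.2), "label": "fist"},
    {"box": (0.5, 0.5, 0.3, 0.3), "label": "palm"},
    {"box": (0.1, 0.1, 0.2, 0.2), "label": "palm"},
]
gts = [{"box": (0.52, 0.48, 0.3, 0.3), "label": "palm"}]

matched, cost = bipartite_match(preds, gts)  # matched == [1]
```

In this toy example only the second query both overlaps the annotation and carries the right label, so it alone is kept as a detection; the other two queries incur high cost and are discarded.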
