Physiotherapy plays a crucial role in rehabilitating organs impaired by injury or illness, and it often requires long-term supervision by a physiotherapist in a clinic or at home. AI-based support systems have been developed to enhance the precision and effectiveness of physiotherapy, particularly during the COVID-19 pandemic. These systems, which include game-based and tele-rehabilitation monitoring using camera-based optical systems such as Vicon and Microsoft Kinect, face challenges including privacy concerns, occlusion, and sensitivity to ambient light. Non-optical alternatives, such as inertial measurement units (IMUs), Wi-Fi, ultrasound sensors, and ultra-wideband (UWB) radar, have emerged to address these issues. Although IMUs are portable and cost-effective, they suffer from drift over time, limited range, and susceptibility to magnetic interference. In this study, a single UWB radar was used to recognize five therapeutic upper-limb exercises performed by 34 male volunteers in a real environment. A novel feature fusion approach was developed to extract discriminative features for these exercises. Several machine learning methods were applied, with the EnsembleRRGraBoost ensemble method achieving the highest recognition accuracy of 99.45%. The performance of the EnsembleRRGraBoost model was further validated using five-fold cross-validation, in which it maintained its high accuracy.
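The following is a minimal sketch of the evaluation protocol described above: training an ensemble classifier on fused feature vectors and validating it with five-fold cross-validation. The data shapes, feature count, and the use of scikit-learn's GradientBoostingClassifier are assumptions for illustration only; the paper's EnsembleRRGraBoost model and its actual feature fusion pipeline are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Hypothetical placeholder data: each row is a fused feature vector extracted
# from UWB radar returns; labels are the five upper-limb exercise classes (0-4).
rng = np.random.default_rng(0)
X = rng.normal(size=(340, 64))      # e.g. 34 subjects x 10 repetitions, 64 fused features (assumed)
y = rng.integers(0, 5, size=340)    # five exercise classes

# Stand-in boosted ensemble classifier (EnsembleRRGraBoost is not a standard
# library model; gradient boosting is used here purely as an illustrative proxy).
clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1, random_state=0)

# Five-fold cross-validation, mirroring the validation protocol in the abstract.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print("Per-fold accuracy:", scores)
print(f"Mean accuracy: {scores.mean():.4f}")
```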