We present a machine-learning-driven system to monitor joint flexion angles during dynamic motion using a wearable loop-based sensor. Our approach uses wearable loops to collect transmission coefficient data and an Artificial Neural Network (ANN) with fine-tuned parameters to improve the accuracy of the measured angles. We train and validate the ANN for sagittal plane flexion of a leg phantom emulating slow motion, walking, brisk walking, and jogging. We fabricate the loops from conductive threads and evaluate the effect of fabric-induced drift via measurements in the absence and presence of fabric. In the absence of fabric, our model produced root mean square errors (RMSE) of 5.90°, 6.11°, 5.90°, and 5.44° during slow motion, walking, brisk walking, and jogging, respectively. The presence of fabric degraded the RMSE to 8.97°, 7.21°, 9.41°, and 7.79°, respectively. Without the proposed ANN method, errors exceeded 35.07° in all scenarios. Proof-of-concept results on three human subjects further validate this performance. Our approach demonstrates the feasibility of wearable loop sensors for motion capture in dynamic, real-world environments. Increasing motion speed and the presence of fabric degrade sensor performance due to added noise. Nevertheless, the proposed framework is generalizable and can be extended in the future to improve the reported angular resolution.
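To make the pipeline concrete, the sketch below shows one plausible way to regress flexion angle from loop-sensor transmission coefficients with a small feed-forward ANN and to score it with RMSE, the metric reported above. The feature layout (|S21| magnitudes sampled at several frequencies), network size, and synthetic data are illustrative assumptions, not the paper's exact architecture or dataset.

```python
# Minimal sketch (assumptions): map |S21| features -> flexion angle with an ANN.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical dataset: each row holds |S21| (dB) at several frequencies;
# the target is the sagittal-plane flexion angle in degrees.
n_samples, n_freqs = 2000, 16
X = rng.uniform(-40.0, -10.0, size=(n_samples, n_freqs))  # placeholder features
y = rng.uniform(0.0, 120.0, size=n_samples)               # placeholder angles (deg)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_train)

# Small fully connected network; layer sizes are assumed for illustration only.
ann = MLPRegressor(hidden_layer_sizes=(64, 32),
                   activation="relu",
                   max_iter=2000,
                   random_state=0)
ann.fit(scaler.transform(X_train), y_train)

# RMSE in degrees, matching the error measure reported in the abstract.
pred = ann.predict(scaler.transform(X_test))
rmse = np.sqrt(mean_squared_error(y_test, pred))
print(f"RMSE: {rmse:.2f} deg")
```

In practice, the training data would come from synchronized sensor readings and ground-truth angles (e.g., from an optical motion-capture reference), with separate models or training splits for each motion speed and fabric condition.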