Abstract

Objective
To develop a deep-learning framework that predicts lower-limb joint kinematics from IMU data across multiple gait tasks (walking, jogging, and running) and to evaluate the impact of dynamic time warping (DTW) on reducing prediction errors.

Patients and Methods
Data were collected from 18 participants fitted with IMUs and an optical motion capture (OMC) system from May 25 to May 30, 2023. A long short-term memory (LSTM) autoencoder supervised regression model was developed, consisting of multiple LSTM and convolution layers. Triaxial acceleration and gyroscope data from the IMUs, together with their magnitudes, for the proximal and distal sensors of each joint (hip, knee, ankle) served as model inputs. OMC kinematics were considered ground truth and used as the output to train the prediction model.

Results
The deep-learning models achieved a root mean square error (RMSE) of less than 6° for hip, knee, and ankle joint sagittal-plane angles, with the ankle showing the lowest error (5.1°). Task-specific models demonstrated enhanced performance during certain gait phases, such as knee flexion during running. Applying DTW significantly reduced RMSE across all tasks, by at least 3° to 4°. External validation on independent data confirmed the model's generalizability.

Conclusion
Our findings underscore the potential of IMU-based deep-learning models for joint kinematic prediction, offering a practical solution for remote and continuous biomechanical assessment in healthcare and sports science.
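The abstract does not give implementation details for the DTW step, but the idea is standard: before computing RMSE, the predicted joint-angle curve is nonlinearly aligned in time to the OMC ground-truth curve, so small timing offsets between sensor streams do not inflate the error. A minimal pure-Python sketch of the classic DTW recurrence (all names here are illustrative, not from the paper):

```python
def dtw_distance(pred, truth):
    """Dynamic-time-warping distance between two 1-D angle sequences.

    cost[i][j] holds the minimal cumulative |difference| to align the
    first i samples of `pred` with the first j samples of `truth`.
    """
    n, m = len(pred), len(truth)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(pred[i - 1] - truth[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch truth
                                 cost[i][j - 1],      # stretch pred
                                 cost[i - 1][j - 1])  # step both
    return cost[n][m]

# A prediction that lags the reference by one sample: pointwise error
# is large, but DTW aligns the shapes and reports a much smaller cost.
truth = [0.0, 1.0, 2.0, 1.0, 0.0]
pred = [0.0, 0.0, 1.0, 2.0, 1.0]
pointwise = sum(abs(p - t) for p, t in zip(pred, truth))
aligned = dtw_distance(pred, truth)
```

This illustrates why DTW-based alignment lowers reported RMSE for quasi-periodic gait signals: the metric becomes insensitive to phase shifts while still penalizing genuine amplitude errors.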
