Wearable technologies such as inertial measurement units (IMUs) can be used to evaluate human gait and improve mobility, but sensor fixation is still a limitation that needs to be addressed. Therefore, the aim of this study was to create a machine learning algorithm to predict gait events using a single IMU mimicking the carrying of a smartphone. Fifty-two healthy adults (35 males/17 females) walked on a treadmill at various speeds while carrying a surrogate smartphone in the right hand, front right trouser pocket, and right jacket pocket. Ground-truth gait events (i.e. heel strikes and toe-offs) were determined bilaterally using a gold-standard optical motion capture system. The tri-axial accelerometer and gyroscope data were segmented into 20-ms windows, which were labelled according to whether or not they contained a gait event. A long short-term memory neural network (LSTM-NN) was used to classify each 20-ms window as containing a heel strike or toe-off of the right or left leg, using 80% of the data for training and 20% for testing. The results demonstrated an overall accuracy of 92% across all phone positions and walking speeds, with slightly higher accuracy for right-side predictions (∼94%) than for left-side predictions (∼91%). Moreover, we found a median time error <3% of the gait cycle duration across all speeds and positions (∼77 ms). Our results represent a promising first step towards using smartphones for remote gait analysis without requiring IMU fixation, but further research is needed to enhance generalizability and explore real-world deployment.
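The abstract does not specify the software framework, sampling rate, or label encoding used. The sketch below is only a minimal illustration of the described pipeline (window segmentation, event labelling, and LSTM classification of 20-ms windows with an 80/20 train/test split); PyTorch, the sampling rate `FS`, the five-class label layout, and all function and class names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Assumed constants (not given in the abstract): sampling rate and class layout.
FS = 500                       # Hz, hypothetical IMU sampling rate
WIN = int(0.020 * FS)          # samples per 20-ms window
N_CHANNELS = 6                 # tri-axial accelerometer + tri-axial gyroscope
CLASSES = ["none", "right_HS", "right_TO", "left_HS", "left_TO"]

def segment_windows(imu: np.ndarray, event_labels: np.ndarray):
    """Split a (T, 6) IMU recording into non-overlapping 20-ms windows.

    `event_labels` is a per-sample integer array (0 = no event, 1-4 = event
    class); each window inherits the label of any event it contains.
    """
    n_win = imu.shape[0] // WIN
    X = imu[: n_win * WIN].reshape(n_win, WIN, N_CHANNELS)
    y = np.zeros(n_win, dtype=np.int64)
    for i in range(n_win):
        y[i] = event_labels[i * WIN : (i + 1) * WIN].max()
    return X, y

class GaitEventLSTM(nn.Module):
    """Minimal LSTM classifier: one window of raw samples -> one event class."""
    def __init__(self, hidden: int = 64, n_classes: int = len(CLASSES)):
        super().__init__()
        self.lstm = nn.LSTM(N_CHANNELS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                  # x: (batch, WIN, 6)
        _, (h, _) = self.lstm(x)           # final hidden state summarizes the window
        return self.head(h[-1])            # (batch, n_classes) logits

# Toy training pass on synthetic data with an 80/20 split, for illustration only.
rng = np.random.default_rng(0)
imu = rng.standard_normal((100_000, N_CHANNELS)).astype(np.float32)
labels = rng.integers(0, len(CLASSES), size=100_000)
X, y = segment_windows(imu, labels)
split = int(0.8 * len(X))
train = TensorDataset(torch.from_numpy(X[:split]), torch.from_numpy(y[:split]))
loader = DataLoader(train, batch_size=256, shuffle=True)

model = GaitEventLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for xb, yb in loader:                      # single epoch for brevity
    opt.zero_grad()
    loss_fn(model(xb), yb).backward()
    opt.step()
```

In practice the reported timing errors would be obtained by comparing the timestamps of windows predicted to contain an event against the optical motion capture ground truth, and any class imbalance between "event" and "no event" windows would need to be handled (e.g. by weighting the loss); neither detail is specified in the abstract.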