Abstract

Freezing of gait (FoG) is a serious gait disorder commonly seen in patients with advanced Parkinson's disease (PD). It is of great interest to predict the occurrence of FoG before its actual onset, making it possible to prevent gait freezing by providing suitable external cues. Most previous FoG prediction approaches rely on manually selected gait features. However, the extraction of these features depends heavily on expert knowledge, and they may be insufficient to represent the different gait classes well, resulting in poor generalization of the prediction models built on them. In contrast, a deep learning model can learn discriminative features in a latent high-dimensional space that covers inter-patient variations, and therefore has the potential to deliver better prediction performance. In this study, the manually selected features are embedded into a deep neural network, ResNeXt, to better learn discriminative, class-specific gait features. Our experiments are conducted on the public Daphnet dataset, which contains gait recordings of ten PD patients, eight of whom exhibit FoG episodes. Different segment lengths and pre-FoG durations were explored, and the best performance was obtained with a pre-FoG duration of 5 s and a segment length of 1 s. The best prediction accuracy is 95.40%, with an MF1 score of 0.89 and a Kappa coefficient of 0.87. Comparison with other state-of-the-art methods indicates that the proposed approach provides competitive performance for FoG prediction.
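
For concreteness, the sketch below (Python/NumPy, not the authors' code) illustrates one way to realize the windowing and pre-FoG labelling described in the abstract: non-overlapping 1 s segments, with the 5 s immediately preceding each freeze onset treated as the pre-FoG class. The 64 Hz sampling rate, the Daphnet label convention (1 = normal gait, 2 = freeze) and all function names are assumptions made for illustration only.

import numpy as np

# Minimal sketch under assumed conventions; not the authors' implementation.
FS = 64                 # Daphnet sampling rate in Hz (assumed)
SEGMENT_S = 1.0         # best segment length reported in the abstract
PRE_FOG_S = 5.0         # best pre-FoG duration reported in the abstract

def relabel_pre_fog(labels, pre_fog_samples):
    """Relabel the samples immediately before each freeze onset as pre-FoG.

    Input labels follow the (assumed) Daphnet convention: 1 = normal gait,
    2 = freeze; output classes are 0 = normal, 1 = pre-FoG, 2 = FoG.
    """
    out = np.where(labels == 2, 2, 0)
    # Indices where a freeze segment begins (transition from non-freeze to freeze).
    onsets = np.flatnonzero((labels[1:] == 2) & (labels[:-1] != 2)) + 1
    for onset in onsets:
        start = max(0, onset - pre_fog_samples)
        window = out[start:onset]
        window[window == 0] = 1          # only overwrite normal-gait samples
    return out

def segment(signal, labels):
    """Cut a (T, channels) recording into non-overlapping 1 s windows.

    Each window receives the majority class of its samples as its label.
    """
    win = int(SEGMENT_S * FS)
    relabelled = relabel_pre_fog(labels, int(PRE_FOG_S * FS))
    xs, ys = [], []
    for i in range(len(labels) // win):
        sl = slice(i * win, (i + 1) * win)
        xs.append(signal[sl])
        ys.append(np.bincount(relabelled[sl], minlength=3).argmax())
    return np.stack(xs), np.array(ys)

Each resulting window (optionally concatenated with the hand-crafted gait features mentioned above) could then be fed to a 1D ResNeXt-style classifier and scored with accuracy, macro-F1 and Cohen's Kappa, matching the metrics reported in the abstract.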
