Abstract
Background: The core clinical sign of Parkinson's disease (PD) is bradykinesia, for which a standard test is finger tapping: the clinician observes a person repetitively tapping finger and thumb together. This requires an expert eye, a scarce resource, and even experts show variability and inaccuracy. Existing applications of technology to finger tapping reduce the tapping signal to one-dimensional measures, with researcher-defined features derived from those measures.

Objectives: (1) To apply a deep learning neural network directly to video of finger tapping, without human-defined measures or features, and determine classification accuracy for idiopathic PD versus controls. (2) To visualise the features learned by the model.

Methods: 152 smartphone videos of 10 s finger tapping were collected from 40 people with PD and 37 controls. We down-sampled the pixel dimensions and split the videos into 1 s clips. A 3D convolutional neural network was trained on these clips.

Results: For discriminating PD from controls, our model showed a training accuracy of 0.91 and a test accuracy of 0.69, with test precision 0.73, test recall 0.76 and test AUROC 0.76. We also report class activation maps for the five most predictive features. These show the spatial and temporal sections of video upon which the network focuses attention to make a prediction, including an apparent dropping thumb movement distinct to the PD group.

Conclusions: A deep learning neural network can be applied directly to standard video of finger tapping to distinguish PD from controls, without a requirement to extract a one-dimensional signal from the video or to pre-define tapping features.
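The preprocessing described in the Methods (spatial down-sampling, then splitting each video into 1 s clips) can be sketched as follows. This is a minimal illustration assuming videos held as NumPy arrays; the stride-based down-sampling, the 30 fps frame rate, and the function name are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def preprocess_video(frames: np.ndarray, fps: int, scale: int) -> list:
    """Down-sample pixel dimensions and split a video array of shape
    (n_frames, height, width, channels) into non-overlapping 1 s clips.

    Assumptions (not from the paper): down-sampling by pixel striding,
    and dropping any trailing partial clip.
    """
    # Spatial down-sampling: keep every `scale`-th pixel in each dimension.
    small = frames[:, ::scale, ::scale, :]
    # One clip per second of video; a 1 s clip contains `fps` frames.
    n_clips = small.shape[0] // fps
    return [small[i * fps:(i + 1) * fps] for i in range(n_clips)]

# Example: a 10 s recording at an assumed 30 fps, 240x320 RGB,
# halved in resolution, yields ten clips of shape (30, 120, 160, 3).
video = np.zeros((300, 240, 320, 3), dtype=np.uint8)
clips = preprocess_video(video, fps=30, scale=2)
```

Each resulting clip is a short 4D tensor suitable as input to a 3D convolutional network, which convolves over the two spatial dimensions and time jointly.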