Abstract
The application of pose assessment to rehabilitation training has received growing attention in recent years. However, current evaluation indicators are mostly based on user-defined scores or scoring functions, which are subjective and difficult for patients to use directly. In this paper, we propose a new formulation of pose matching, termed pose-guided matching, which aims to provide objective and accurate scores, feedback, and guidance to patients when their pose is compared with a standard pose. Specifically, we propose a pair-based Siamese Convolutional Neural Network (SCNN), abbreviated ST-AMCNN, to realize pose-guided matching on the eight-section brocade dataset, one of the most representative traditional rehabilitation exercises in China. We simplify multi-stage pose matching by merging two standalone modules (the alignment module and the matching module) into a one-stage task, so that only one loss function needs to be tuned, which reduces computational complexity. On top of the Spatial Transformer Network (STN) employed as the alignment module, we propose a new Attention-based Multi-Scale Convolution (AMC) to match posture parts at different scales. The proposed AMC also assigns greater weight to useful pose features than to irrelevant features such as the background, yielding further performance gains. Finally, Gradient-weighted Class Activation Mapping (Grad-CAM) is adopted to visualize the matching result for the learner. Experimental results indicate that ST-AMCNN achieves performance competitive with state-of-the-art models and provides accurate feedback for learners during rehabilitation training. The proposed method has also been deployed in client software for testing.
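To make the architecture described above concrete, the following is a minimal sketch, not the authors' implementation, of a Siamese branch that applies an STN for alignment followed by an attention-weighted multi-scale convolution for matching, trained end-to-end with a single pairwise distance. PyTorch, the layer sizes, the 64x64 input resolution, and the distance-based output are illustrative assumptions; class names such as `STN`, `AMC`, and `STAMCNN` are hypothetical.

```python
# Minimal sketch (not the paper's code): Siamese branch = STN alignment + attention-based
# multi-scale convolution, compared with a single pairwise distance (one-stage training).
import torch
import torch.nn as nn
import torch.nn.functional as F

class STN(nn.Module):
    """Predicts an affine transform and warps the input to align the pose (assumed 64x64 input)."""
    def __init__(self, in_ch=3):
        super().__init__()
        self.loc = nn.Sequential(
            nn.Conv2d(in_ch, 8, 7), nn.MaxPool2d(2), nn.ReLU(),
            nn.Conv2d(8, 10, 5), nn.MaxPool2d(2), nn.ReLU(),
        )
        self.fc = nn.Sequential(nn.Linear(10 * 12 * 12, 32), nn.ReLU(), nn.Linear(32, 6))
        # Start from the identity transform so early training applies "no warp".
        self.fc[-1].weight.data.zero_()
        self.fc[-1].bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, x):
        theta = self.fc(self.loc(x).flatten(1)).view(-1, 2, 3)
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)

class AMC(nn.Module):
    """Attention-based multi-scale convolution: parallel kernels cover posture parts of
    different sizes; a channel-attention gate down-weights irrelevant (e.g. background) features."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in (1, 3, 5)
        )
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(3 * out_ch, 3 * out_ch, 1), nn.Sigmoid()
        )
        self.fuse = nn.Conv2d(3 * out_ch, out_ch, 1)

    def forward(self, x):
        feats = torch.cat([b(x) for b in self.branches], dim=1)
        return self.fuse(feats * self.attn(feats))

class STAMCNN(nn.Module):
    """Siamese network: the same branch embeds the learner pose and the standard pose."""
    def __init__(self):
        super().__init__()
        self.stn = STN()
        self.features = nn.Sequential(AMC(3, 32), nn.ReLU(), nn.MaxPool2d(2),
                                      AMC(32, 64), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.embed = nn.Linear(64, 128)

    def embed_one(self, x):
        return self.embed(self.features(self.stn(x)).flatten(1))

    def forward(self, learner, standard):
        # One-stage setup: alignment and matching are trained under a single matching loss.
        return F.pairwise_distance(self.embed_one(learner), self.embed_one(standard))

if __name__ == "__main__":
    model = STAMCNN()
    a, b = torch.randn(2, 3, 64, 64), torch.randn(2, 3, 64, 64)
    print(model(a, b).shape)  # torch.Size([2]) -- one matching distance per pose pair
```

In this sketch the matching distance could be supervised with a contrastive-style loss over matched/mismatched pose pairs, and Grad-CAM could then be run on the shared branch to visualize which body regions drive the score; both choices are assumptions rather than details taken from the paper.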