Abstract

Skeleton-based action recognition has rapidly become one of the most popular and essential research topics in computer vision. The task is to analyze the characteristics of human joints and accurately classify their behaviors using deep learning techniques. The skeleton modality offers several unique advantages over other data modalities, such as robustness, compactness, and noise immunity. In particular, skeleton data are extremely lightweight, which is especially beneficial for deep learning research in low-resource environments. Because skeleton data are non-Euclidean, Graph Convolutional Networks (GCNs) have been the mainstream approach in recent years, leveraging their strength in processing topological information. However, with the explosive development of transformer methods in natural language processing and computer vision, many works have applied transformers to skeleton-based action recognition, breaking the accuracy monopoly of GCNs. Therefore, we survey transformer methods for skeleton-based action recognition and form a taxonomy of existing works. This paper gives a comprehensive overview of recent transformer techniques for skeleton action recognition, proposes a taxonomy of transformer-style techniques for action recognition, studies the benchmark datasets in detail, compares the accuracy of representative methods, and finally discusses future research directions and trends. To the best of our knowledge, this is the first survey to describe transformer-style skeleton-based action recognition techniques and to propose a novel recognition taxonomy. We believe that transformer-based action recognition will become mainstream in the near future, so this survey aims to help researchers systematically learn the core tasks, select appropriate datasets, understand current challenges, and identify promising future directions.
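To make the GCN-versus-transformer contrast concrete, the following is a minimal, illustrative sketch (not taken from any specific surveyed method) of single-head self-attention over skeleton joints: instead of aggregating features along a fixed skeleton graph as a GCN would, every joint attends to every other joint, so joint-to-joint relationships are learned from the data. The joint count, feature sizes, and weight matrices are assumptions for illustration only.

```python
import numpy as np

def joint_self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over skeleton joints.

    X          : (J, C) array -- J joints, each with a C-dim feature
                 (e.g. 3D coordinates).
    Wq, Wk, Wv : (C, D) projection matrices (randomly initialized here;
                 learned in a real model).
    Returns a (J, D) array: each joint's feature updated by attending to
    all joints, rather than only graph neighbors as in a GCN layer.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])            # (J, J) joint-to-joint affinities
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)           # row-wise softmax
    return attn @ V

# Toy skeleton: 25 joints with 3D coordinates (25 is the NTU RGB+D joint count).
rng = np.random.default_rng(0)
X = rng.standard_normal((25, 3))
Wq, Wk, Wv = (rng.standard_normal((3, 8)) for _ in range(3))
out = joint_self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (25, 8)
```

In practice, transformer methods stack such attention layers (often separately over the joint and time axes) and add positional encodings so the model knows which joint and frame each feature comes from.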
