Abstract

Many existing models consider only the temporal information between consecutive skeleton frames and lack the ability to model spatial information. To address this, this study proposes a model for recognizing abnormal worker behaviors, such as falling and lying down, based on human skeletal key points and a spatio-temporal graph convolutional network (ST-GCN). Human skeletons are extracted from video sequences using AlphaPose. To address the limited effectiveness of graph convolutional networks in aggregating skeletal key-point features, we propose an NAM-STGCN model that incorporates a normalization-based attention module (NAM). By further optimizing the model structure with the PReLU activation function, the improved ST-GCN extracts action features from skeletal key points more effectively across the spatial and temporal dimensions for abnormal behavior recognition. Experimental results show that the optimized model achieves 96.72% recognition accuracy on our self-built dataset, 4.92% higher than the original model, and its loss converges to below 0.2. Tests on the KTH and Le2i datasets show that the model also outperforms typical classification networks on both. The model can precisely identify abnormal human behaviors, facilitating timely detection of abnormal events and rescue, and offering new ideas for smart construction-site development.
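Only the abstract is available here, but the architectural change it describes (a normalization-based attention module and a PReLU activation added to an ST-GCN block) can be sketched. The following is a minimal, illustrative PyTorch sketch, not the authors' implementation: the class names (NAMChannelAttention, STGCNBlockWithNAM), the single-partition graph convolution, and the exact placement of the attention and activation are all assumptions.

```python
import torch
import torch.nn as nn


class NAMChannelAttention(nn.Module):
    """Channel attention that reuses BatchNorm scale factors (gamma) as
    channel-importance weights -- a sketch of the normalization-based
    attention idea referenced in the abstract."""

    def __init__(self, channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        # x: (N, C, T, V) -- batch, channels, frames, skeleton joints
        residual = x
        x = self.bn(x)
        # Normalize |gamma| so each channel's BN scale acts as its weight.
        weight = self.bn.weight.abs() / self.bn.weight.abs().sum()
        x = x * weight.view(1, -1, 1, 1)
        return torch.sigmoid(x) * residual


class STGCNBlockWithNAM(nn.Module):
    """Simplified spatial graph conv + temporal conv block with NAM
    attention and PReLU activation (illustrative only)."""

    def __init__(self, in_channels, out_channels, kernel_t=9):
        super().__init__()
        self.gcn = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.tcn = nn.Conv2d(out_channels, out_channels,
                             kernel_size=(kernel_t, 1),
                             padding=((kernel_t - 1) // 2, 0))
        self.attn = NAMChannelAttention(out_channels)
        self.act = nn.PReLU()

    def forward(self, x, adj):
        # x: (N, C, T, V); adj: normalized (V, V) skeleton adjacency matrix
        x = torch.einsum('nctv,vw->nctw', self.gcn(x), adj)  # spatial aggregation over joints
        x = self.tcn(x)                                      # temporal convolution over frames
        x = self.attn(x)                                     # NAM channel attention
        return self.act(x)                                   # PReLU activation


# Example usage with hypothetical shapes: 8 clips, (x, y, confidence) per joint,
# 30 frames, 18 AlphaPose-style joints; an identity adjacency stands in for a
# real normalized skeleton graph.
block = STGCNBlockWithNAM(in_channels=3, out_channels=64)
x = torch.randn(8, 3, 30, 18)
adj = torch.eye(18)
out = block(x, adj)  # -> (8, 64, 30, 18)
```

PReLU is used here instead of ReLU so that small negative responses from sparse joint features keep a learnable slope rather than being zeroed out, which is one plausible reading of the "optimize the model structure" claim in the abstract.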
