In the manufacturing environment, prioritizing the well-being of workers is both essential and challenging, and recognizing and monitoring workers' actions in real time is crucial for ensuring their safety and health. This study introduces an approach that leverages smart manufacturing technologies and Graph Neural Networks (GNNs) to enhance human action recognition using readily accessible skeletal data. Traditional neural networks are limited in their ability to process such human-related data, including image and skeletal data, prompting the development of the Graph Residual Attention SAGE Network (GRASNet). GRASNet captures intricate local connectivity within graph data while balancing model complexity and computational efficiency. Experimental validation on two manufacturing-related datasets, derived from the NTU RGB+D 120 benchmark dataset, shows that GRASNet monitors abnormal human behaviors and health issues with high accuracy, outperforming other existing deep learning models.
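The abstract does not spell out GRASNet's internal design, but as a rough illustration of how a SAGE-style graph layer with residual connections and an attention readout can be composed for skeleton-based action recognition, the following is a minimal PyTorch sketch. The class names (SAGEResidualLayer, AttentionPool, GRASNetSketch), the layer composition, the feature dimensions, and the placeholder adjacency matrix are all assumptions made for illustration, not the published GRASNet architecture.

```python
# Minimal sketch only; layer composition, dimensions, and adjacency are
# illustrative assumptions, not the paper's exact GRASNet design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SAGEResidualLayer(nn.Module):
    """GraphSAGE-style layer with mean neighbor aggregation and a residual skip."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin_self = nn.Linear(in_dim, out_dim)
        self.lin_neigh = nn.Linear(in_dim, out_dim)
        # Project the input for the residual path when dimensions differ.
        self.skip = nn.Linear(in_dim, out_dim) if in_dim != out_dim else nn.Identity()

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (batch, joints, in_dim); adj: (joints, joints) skeleton adjacency.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neigh = (adj @ x) / deg                       # mean over neighboring joints
        h = self.lin_self(x) + self.lin_neigh(neigh)  # SAGE-style update
        return F.relu(h) + self.skip(x)               # residual connection


class AttentionPool(nn.Module):
    """Attention-weighted readout over joints (one learned score per joint)."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.softmax(self.score(x), dim=1)       # (batch, joints, 1)
        return (w * x).sum(dim=1)                     # (batch, dim)


class GRASNetSketch(nn.Module):
    """Illustrative residual-attention SAGE classifier for skeleton actions."""

    def __init__(self, in_dim: int = 3, hidden: int = 64,
                 num_classes: int = 10, num_layers: int = 2):
        super().__init__()
        dims = [in_dim] + [hidden] * num_layers
        self.layers = nn.ModuleList(
            [SAGEResidualLayer(dims[i], dims[i + 1]) for i in range(num_layers)]
        )
        self.pool = AttentionPool(hidden)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x, adj)
        return self.head(self.pool(x))


if __name__ == "__main__":
    # 25 joints with 3-D coordinates, as in NTU RGB+D skeleton data.
    joints, batch = 25, 4
    adj = torch.eye(joints)                           # placeholder adjacency (self-loops only)
    x = torch.randn(batch, joints, 3)
    logits = GRASNetSketch(num_classes=10)(x, adj)
    print(logits.shape)                               # torch.Size([4, 10])
```

In this sketch the residual skip preserves per-joint features across layers, while the attention pool weights joints before classification; any resemblance to the paper's actual mechanism is conjectural.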