Inadequate training poses a significant risk of injury among young firefighters. Although Human Activity Recognition (HAR) algorithms have shown potential in monitoring and evaluating performance, most existing studies focus on daily activities and have difficulty distinguishing complex firefighting tasks. This study introduces the Internet of Things (IoT)-based wearable firefighting activity recognition (IoT-FAR) system, which employs a multi-modal sensor fusion approach to achieve comprehensive activity recognition during firefighting training. The IoT-FAR system comprises five wearable body sensor nodes and a coordinator node. This study explores the significance of features extracted from surface electromyography, heart rate, and inertial measurement unit sensors for firefighting training activity recognition. A hybrid machine learning (HML)-based network is proposed, which integrates three models: one trained with all features (MA), another with upper-body features (MU), and a third with lower-body features (ML). The proposed HML-SVM-RBF1-RF2 network achieves superior performance, with a mean recall of 93.94%, a mean precision of 90.94%, and a mean accuracy of 98.29%. Additionally, the study introduces the specialized firefighting training associated activities (SFTAA) dataset, which includes endurance training activities involving a self-contained breathing apparatus (SCBA) performed by eighteen firefighters. This dataset represents preliminary work towards building a comprehensive dataset covering various events and scenarios for tracking firefighter activities. The IoT-FAR system also demonstrates the potential use of misclassified activities as evaluation metrics for assessing firefighter training performance.
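The hybrid network described above combines three classifiers trained on different feature subsets. A minimal sketch of that idea is shown below, assuming a soft-voting fusion of class probabilities; the abstract does not specify the fusion rule, the exact model hyperparameters, or the feature split, so the synthetic data, the 12/12 upper/lower partition, and the averaging step here are all illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the SFTAA features: 24 features total,
# the first 12 treated as "upper body", the last 12 as "lower body".
X, y = make_classification(n_samples=600, n_features=24, n_informative=12,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

UPPER, LOWER = slice(0, 12), slice(12, 24)

# MA: all features (SVM with RBF kernel, echoing "SVM-RBF" in the model name);
# MU / ML: upper- and lower-body subsets (random forests, echoing "RF").
ma = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)
mu = RandomForestClassifier(random_state=0).fit(X_tr[:, UPPER], y_tr)
ml = RandomForestClassifier(random_state=0).fit(X_tr[:, LOWER], y_tr)

def hybrid_predict(X):
    """Fuse the three models by averaging class probabilities (soft voting)."""
    p = (ma.predict_proba(X)
         + mu.predict_proba(X[:, UPPER])
         + ml.predict_proba(X[:, LOWER])) / 3.0
    return p.argmax(axis=1)

acc = (hybrid_predict(X_te) == y_te).mean()
print(f"hybrid accuracy on synthetic data: {acc:.2f}")
```

On real multi-modal data, per-model confidence weighting or a stacked meta-classifier could replace the plain average; the point of the sketch is only the structure of training MA, MU, and ML on distinct feature views and fusing their outputs.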