Manufacturing workers face prolonged, strenuous physical activities, and the resulting work-related fatigue affects both their health and workplace finances. Continuously monitoring physical fatigue and providing meaningful feedback is crucial to mitigating human and monetary losses in manufacturing workplaces. This study introduces a novel application of multimodal wearable sensors and machine learning techniques to quantify physical fatigue and tackle the challenges of real-time monitoring on the factory floor. Unlike past studies that treat fatigue as a dichotomous variable, our central formulation predicts multilevel fatigue, providing a more nuanced understanding of the subject's physical state. Our multimodal sensing framework is designed for continuous monitoring of vital signs, including heart rate, heart rate variability, and skin temperature, as well as locomotion signals captured by inertial measurement units strategically placed at six locations on the upper body. This comprehensive sensor placement allows us to capture detailed data from both the torso and arms, surpassing the capabilities of single-point data collection methods. We developed an asymmetric loss function for our machine learning model that enhances prediction accuracy for numerical fatigue levels and supports real-time inference. We collected data from 43 subjects following an authentic manufacturing protocol and logged their self-reported fatigue. Based on this analysis, we provide insights into our multilevel fatigue monitoring system and discuss results from an in-the-wild evaluation of actual operators on the factory floor. This study demonstrates our system's practical applicability and contributes a valuable open-access database for future research.
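The abstract does not specify the form of the asymmetric loss, so the following is only a minimal illustrative sketch of the general idea of asymmetrically weighted regression error for numeric fatigue levels; the function name, the weights, and the choice to penalize underestimation more than overestimation are all assumptions, not the authors' published formulation.

```python
import torch

def asymmetric_mse(pred, target, under_weight=2.0, over_weight=1.0):
    """Illustrative asymmetric squared-error loss.

    Errors where the model underestimates fatigue (pred < target) are
    weighted more heavily than overestimates; weights are placeholders.
    """
    err = pred - target
    # Pick a per-element weight based on the sign of the error.
    weights = torch.where(err < 0,
                          torch.full_like(err, under_weight),
                          torch.full_like(err, over_weight))
    return (weights * err.pow(2)).mean()

# Example usage with hypothetical fatigue scores on a numeric scale.
pred = torch.tensor([3.2, 5.0, 1.8])
target = torch.tensor([4.0, 5.0, 2.0])
loss = asymmetric_mse(pred, target)
```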