Abstract

In this paper, we aim to develop an automatic system to monitor and evaluate worker efficiency for smart manufacturing based on human pose tracking and temporal action localization. First, we explore generative adversarial networks (GANs) to achieve significantly improved estimation of human body joints. Second, we formulate automated worker efficiency analysis as a temporal action localization problem in which the action video performed by the worker is matched against a reference video performed by a teacher. We extract invariant spatio-temporal features from the human body pose sequences and perform cross-video matching using dynamic time warping. Our proposed human pose estimation method achieves state-of-the-art performance on the benchmark dataset. Our automated worker efficiency analysis achieves action localization with an average IoU (intersection over union) score larger than 0.9. This represents one of the first systems to provide automated worker efficiency evaluation.
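To make the matching and evaluation steps concrete, the sketch below illustrates (under stated assumptions, not the authors' implementation) how a worker's per-frame pose-feature sequence can be aligned to a reference sequence with dynamic time warping, and how a localized action segment can be scored with temporal IoU. The feature dimensions, sequence lengths, and segment boundaries are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): DTW alignment of a worker's
# pose-feature sequence to a reference ("teacher") sequence, plus a
# temporal IoU score for a localized action segment.
import numpy as np

def dtw_cost(ref, query):
    """Classic DTW over per-frame feature vectors; returns the accumulated cost matrix."""
    n, m = len(ref), len(query)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(ref[i - 1] - query[j - 1])  # frame-to-frame distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[1:, 1:]

def temporal_iou(pred, gt):
    """IoU of two temporal segments given as (start_frame, end_frame)."""
    inter = max(0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = max(pred[1], gt[1]) - min(pred[0], gt[0])
    return inter / union if union > 0 else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.normal(size=(120, 34))    # hypothetical: 120 frames x 17 joints x (x, y)
    query = rng.normal(size=(150, 34))  # worker video of a different length/speed
    D = dtw_cost(ref, query)
    print("alignment cost:", D[-1, -1])
    print("temporal IoU:", temporal_iou((40, 95), (38, 100)))
```

In practice the invariant spatio-temporal features would be derived from the estimated body joints rather than random vectors, and the warping path recovered from the cost matrix would define the localized segment being scored.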
