Abstract

In this paper, we aim to develop an automatic system that monitors and evaluates worker efficiency for smart manufacturing based on human pose tracking and temporal action localization. First, we explore generative adversarial networks (GANs) to achieve significantly improved estimation of human body joints. Second, we formulate automated worker efficiency analysis as a temporal action localization problem in which the action video performed by the worker is matched against a reference video performed by a teacher. We extract invariant spatio-temporal features from the human body pose sequences and perform cross-video matching using dynamic time warping. Our proposed human pose estimation method achieves state-of-the-art performance on the benchmark dataset. Our automated worker efficiency analysis achieves action localization with an average IoU (intersection over union) score larger than 0.9. This represents one of the first systems to provide automated worker efficiency evaluation.
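
To make the cross-video matching step concrete, the sketch below shows a minimal dynamic time warping alignment between a worker's per-frame pose-feature sequence and a teacher's reference sequence. This is only an illustration of the general DTW idea named in the abstract, not the paper's implementation: the function name `dtw_align`, the feature dimensionality, and the toy sequences are assumptions introduced here for the example.

```python
import numpy as np

def dtw_align(query, reference):
    """Align two feature sequences with dynamic time warping.

    query, reference: arrays of shape (T1, D) and (T2, D), e.g. per-frame
    pose descriptors (illustrative; the paper's features may differ).
    Returns the cumulative alignment cost and the warping path as a list of
    (query_index, reference_index) pairs.
    """
    t1, t2 = len(query), len(reference)
    # Pairwise Euclidean distances between frames of the two sequences.
    dist = np.linalg.norm(query[:, None, :] - reference[None, :, :], axis=-1)

    # Accumulated-cost matrix with a padded row/column of infinities.
    acc = np.full((t1 + 1, t2 + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, t1 + 1):
        for j in range(1, t2 + 1):
            acc[i, j] = dist[i - 1, j - 1] + min(
                acc[i - 1, j],      # query frame repeated
                acc[i, j - 1],      # reference frame repeated
                acc[i - 1, j - 1],  # both sequences advance
            )

    # Backtrack from the end to recover the optimal warping path.
    path, i, j = [], t1, t2
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return acc[t1, t2], path[::-1]

if __name__ == "__main__":
    # Toy example: two synthetic pose-feature sequences of different lengths,
    # mimicking a worker performing the same actions at a different pace.
    rng = np.random.default_rng(0)
    reference = rng.normal(size=(40, 16))                        # "teacher" sequence
    query = reference[::2] + 0.05 * rng.normal(size=(20, 16))    # faster "worker" sequence
    cost, path = dtw_align(query, reference)
    print(f"alignment cost: {cost:.2f}, path length: {len(path)}")
```

The warping path recovered by such an alignment gives frame-level correspondences between the worker's video and the reference video, which can then be scored against ground-truth action segments with an IoU-style overlap measure.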
