Abstract

Human behavior understanding is a well-known area of interest for computer vision researchers. This discipline aims to evaluate several aspects of the interactions between humans and system components in order to ensure long-term human well-being. Robust human posture analysis is a crucial step toward this goal. In this paper, the deep representation learning paradigm is used to analyze articulated human posture and assess the risk of work-related musculoskeletal discomfort in manufacturing industries. In particular, we train a deep residual convolutional neural network to predict body joint angles from a single depth image. Estimated joint angles are essential inputs for ergonomists when computing ergonomic assessment metrics. The proposed method builds on the deep residual learning framework, which has demonstrated impressive convergence speed and generalization across vision tasks such as object recognition, localization, and detection. Moreover, we extend a state-of-the-art data generation pipeline to synthesize a dataset featuring simulations of manual tasks performed by different workers. An inverse kinematics stage is proposed to generate the corresponding ground-truth joint angles. Experimental results demonstrate the generalization performance of the proposed method.
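The abstract does not detail the network architecture, but the residual learning framework it invokes centers on identity shortcut connections: each block learns a residual function F(x) and adds the input x back before the nonlinearity. A minimal NumPy sketch of a single residual block's forward pass (fully connected rather than convolutional for brevity; weights and dimensions are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """y = ReLU(F(x) + x): the block learns only the residual F(x),
    while the identity shortcut carries x through unchanged."""
    f = w2 @ relu(w1 @ x)   # two-layer residual branch F(x)
    return relu(f + x)      # add identity shortcut, then nonlinearity

d = 8                                    # feature dimension (illustrative)
x = rng.standard_normal(d)
w1 = rng.standard_normal((d, d)) * 0.1
w2 = rng.standard_normal((d, d)) * 0.1

y = residual_block(x, w1, w2)
print(y.shape)  # (8,)

# If the residual branch outputs zero, the block reduces to ReLU(x),
# which is why very deep residual stacks remain easy to optimize.
assert np.allclose(residual_block(x, np.zeros((d, d)), np.zeros((d, d))), relu(x))
```

The key design point is that driving the residual branch toward zero recovers (approximately) the identity mapping, so stacking many such blocks does not degrade optimization the way plain deep stacks do.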
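The inverse kinematics stage mentioned above recovers joint angles consistent with known body-segment positions. The paper's full-body formulation is not given here; as a hedged illustration of the idea, below is the classic analytic solution for a planar two-link limb (all link lengths and targets are illustrative), verified by a forward-kinematics round trip:

```python
import numpy as np

def ik_two_link(x, y, l1=1.0, l2=1.0):
    """Analytic inverse kinematics for a planar two-link limb:
    given end-effector position (x, y), recover the joint angles."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    theta2 = np.arccos(np.clip(c2, -1.0, 1.0))   # elbow angle (elbow-up branch)
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2

def fk_two_link(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics: joint angles back to end-effector position."""
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return x, y

# Round trip: angles recovered by IK reproduce the target position.
t1, t2 = ik_two_link(1.2, 0.8)
assert np.allclose(fk_two_link(t1, t2), (1.2, 0.8))
```

In a data generation pipeline like the one described, such a stage would map each simulated worker pose's segment positions to the joint-angle labels the network is trained to regress.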
