Abstract

For human-robot collaboration (HRC), one of the most practical ways to ensure human safety with a vision-based system is to establish a minimum safe distance. This study proposes a novel integrated mixed reality (MR) system for safety-aware HRC using deep learning and digital twin generation. The proposed approach accurately measures the minimum safe distance in real time and provides MR-based task assistance to the human operator. It integrates MR with safety-related monitoring by tracking the shared workspace and providing user-centric visualization through smart MR glasses for safe and effective HRC. Two RGB-D sensors are used to reconstruct and track the working environment: one captures 3D point cloud data from one area of the physical environment, while the other scans another area and additionally tracks the user's 3D skeletal information. The two partially scanned environments are then registered together by applying a fast global registration method to the two sets of 3D point cloud data. Furthermore, deep learning-based instance segmentation is applied to the target object's 3D point cloud to improve the registration accuracy between the real robot and its virtual counterpart, the digital twin of the real robot. While previous studies have relied largely on 3D point cloud data alone, this study proposes a simple yet effective 3D offset-based safety distance calculation method based on the robot's digital twin and the human skeleton. The 3D offset-based method enables real-time operation without sacrificing the accuracy of the safety distance calculation for HRC. In addition, two comparative evaluations were conducted to confirm the originality and advantages of the proposed MR-based HRC system.
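
The abstract names a fast global registration method for aligning the two partial scans but does not specify an implementation. As an illustration only, the sketch below shows how such an alignment could be done with the Open3D library's fast global registration (FGR) pipeline on FPFH features; the choice of Open3D, the file names, the voxel size, and the feature parameters are assumptions, not details taken from the paper.

# Illustrative sketch (assumptions noted above): aligning two partial
# RGB-D scans with Open3D's fast global registration on FPFH features.
import open3d as o3d

VOXEL = 0.05  # assumed downsampling resolution in meters

def preprocess(pcd):
    """Downsample, estimate normals, and compute FPFH features."""
    down = pcd.voxel_down_sample(VOXEL)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=VOXEL * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=VOXEL * 5, max_nn=100))
    return down, fpfh

# Two partial scans of the shared workspace, one per RGB-D sensor
# (hypothetical file names).
source = o3d.io.read_point_cloud("sensor1_scan.pcd")
target = o3d.io.read_point_cloud("sensor2_scan.pcd")
src_down, src_fpfh = preprocess(source)
tgt_down, tgt_fpfh = preprocess(target)

result = o3d.pipelines.registration.registration_fgr_based_on_feature_matching(
    src_down, tgt_down, src_fpfh, tgt_fpfh,
    o3d.pipelines.registration.FastGlobalRegistrationOption(
        maximum_correspondence_distance=VOXEL * 1.5))
print(result.transformation)  # 4x4 transform mapping sensor 1 into sensor 2's frame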
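The 3D offset-based safety distance calculation is described only at a high level in the abstract. A minimal sketch of one plausible reading follows: the digital twin's robot links are modeled as capsules (a segment axis plus an offset radius), tracked human joints as points with their own offset, so the minimum safe distance reduces to point-to-segment distances minus the offsets. The capsule model, offset values, threshold, and function names are hypothetical, not taken from the paper.

# Illustrative sketch (assumptions noted above): 3D offset-based minimum
# safe distance between tracked human joints and a robot digital twin
# whose links are approximated as capsules (segment axis + offset radius).
import numpy as np

def point_segment_distance(p, a, b):
    """Shortest Euclidean distance from point p to segment a-b."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def min_safe_distance(joints, links, human_offset=0.10):
    """joints: (N, 3) human joint positions from the skeleton tracker.
    links: list of (a, b, radius) capsules from the robot digital twin.
    human_offset: assumed safety margin around each joint, in meters."""
    d_min = np.inf
    for p in joints:
        for a, b, radius in links:
            d = point_segment_distance(p, a, b) - radius - human_offset
            d_min = min(d_min, d)
    return max(d_min, 0.0)

# Hypothetical usage: trigger a safety reaction when the offset-adjusted
# distance drops below a configured minimum safe distance.
MIN_SAFE = 0.5  # assumed threshold in meters
joints = np.array([[0.8, 0.2, 1.1], [0.9, 0.1, 0.9]])
links = [(np.array([0.0, 0.0, 0.5]), np.array([0.0, 0.0, 1.0]), 0.08)]
if min_safe_distance(joints, links) < MIN_SAFE:
    print("Safety stop: human within minimum safe distance")

Because each check involves only a handful of point-to-segment tests rather than full point-cloud proximity queries, this kind of offset-based formulation can run in real time, which is consistent with the real-time applicability claimed in the abstract.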
