Abstract

Smart manufacturing increases the demand for new interfaces that communicate with autonomous technologies such as big data analytics, digital twins, and self-decisive control. Collaboration between humans and such autonomy has become an imperative factor for improving productivity. However, current human-machine interfaces (HMIs), such as 2D screens or panels, require process knowledge and long-term experience to operate; they are neither intuitive for novice workers nor designed to work with autonomy. This study proposes a virtual-reality-based human interface framework for cyber-physical systems (CPS), named immersive and interactive CPS (I2CPS), to create an interface for human-machine-autonomy collaboration. By combining a data-to-information protocol and middleware, MTConnect and the Robot Operating System (ROS), heterogeneous physical systems were integrated with virtual assets such as digital models, digital shadows, and virtual traces of human work in the virtual reality (VR) based interface. All physical and virtual assets were integrated into the interface so that humans, autonomy, and physical systems can collaborate. Applying constraints in the VR interface and deploying virtual human work to industrial robots were demonstrated to verify the effectiveness of the I2CPS framework, showing two forms of human-autonomy collaboration: augmentation of human skills by autonomy, and virtual robot teaching to generate robot programs automatically.
