Abstract
Human–robot collaboration (HRC) has been identified as a highly promising paradigm for human-centric smart manufacturing in the context of Industry 5.0. To enhance both human well-being and robotic flexibility in HRC, numerous research efforts have been devoted to human body perception, yet many of these studies address only specific facets of human recognition and lack a holistic view of the human operator. A novel approach to addressing this challenge is the construction of a human digital twin (HDT), which serves as a centralized digital representation of diverse human data for seamless integration into the cyber-physical production system. By leveraging the HDT, further performance and efficiency optimization can be achieved in an HRC system. However, implementations of visual perception-based HDTs remain underreported, particularly in the HRC realm. To this end, this study proposes an exemplary vision-based HDT model for highly dynamic HRC applications. The model is mainly built on a convolutional neural network that simultaneously models hierarchical human status, including 3D human posture, action intention, and ergonomic risk. On the basis of the constructed HDT, a robotic motion planning strategy is then introduced to adaptively optimize the robotic motion trajectory. Experiments and case studies in an HRC scenario demonstrate the effectiveness of the approach.
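The abstract describes a single convolutional network that jointly estimates 3D posture, action intention, and ergonomic risk. The sketch below is not the authors' implementation; it is a minimal, hypothetical multi-task CNN in PyTorch with a shared backbone and three output heads, illustrating how such a hierarchical human-status model could be organized. The layer sizes, joint count, action classes, and risk levels are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's network): shared CNN backbone with
# three heads for 3D posture, action intention, and ergonomic risk.
import torch
import torch.nn as nn


class HDTPerceptionNet(nn.Module):
    def __init__(self, num_joints=17, num_actions=8, num_risk_levels=4):
        super().__init__()
        self.num_joints = num_joints
        # Shared convolutional backbone over a single RGB frame.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head 1: 3D posture as (x, y, z) coordinates per joint.
        self.pose_head = nn.Linear(128, num_joints * 3)
        # Head 2: action-intention class logits.
        self.intention_head = nn.Linear(128, num_actions)
        # Head 3: ergonomic-risk level logits (e.g., binned risk scores).
        self.risk_head = nn.Linear(128, num_risk_levels)

    def forward(self, x):
        feat = self.backbone(x)
        pose = self.pose_head(feat).view(-1, self.num_joints, 3)
        intention = self.intention_head(feat)
        risk = self.risk_head(feat)
        return pose, intention, risk


if __name__ == "__main__":
    # Example: one forward pass on a dummy 256x256 frame.
    net = HDTPerceptionNet()
    frame = torch.randn(1, 3, 256, 256)
    pose, intention, risk = net(frame)
    print(pose.shape, intention.shape, risk.shape)  # (1, 17, 3) (1, 8) (1, 4)
```

In a setup like this, the three predictions would feed the HDT as a layered human-status record, which a downstream motion planner could query to adapt the robot trajectory; the actual network architecture and planning strategy are detailed in the paper itself.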