Abstract
Molds are assembled manually owing to their inherent characteristics of low-volume, high-variety production. Given the ergonomic risks posed by heavy handling and repetitive tasks, together with the diverse requirements of mold assembly, collaborative robots (cobots) offer adaptability and ease of reconfiguration, making them potential solutions to these challenges. This study introduces a monitoring model for human–robot collaborative mold assembly using two cobots. The model comprises a manual-progress monitoring module for cobot execution and a position-sharing module. Manual task actions are detected with the You Only Look Once v8 (YOLOv8) Nano model, and the detected actions are then classified into assembly states. These states are relayed to the cobots, enabling early execution of subsequent tasks and controlling the cobots' entry into the assembly area. The study also proposes a position-sharing approach that prevents collisions by exchanging coordinates between the two cobots via Modbus. Most existing research has developed separate models for action and part recognition without using the recognition results to enable early execution. This study contributes a novel model that monitors manual progress to enable early execution of subsequent tasks and facilitates communication through position sharing during human–robot collaborative mold assembly tasks. The results show that assembly time and the risk of collisions between cobots can be reduced in human–robot collaborative mold assembly tasks.
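To illustrate the monitoring pipeline described in the abstract (action detection, state classification, state relay, and Modbus position sharing), the following is a minimal Python sketch. It assumes the ultralytics YOLOv8 package and pymodbus for the Modbus TCP link; the action classes, register map, and IP addresses are illustrative placeholders and are not taken from the paper.

```python
# Minimal sketch: detect manual assembly actions with YOLOv8 Nano, map them
# to assembly states, relay the state to two cobots, and share coordinates
# between the cobots over Modbus TCP. All class names, register addresses,
# and IPs below are hypothetical placeholders, not the authors' setup.
import cv2
from ultralytics import YOLO
from pymodbus.client import ModbusTcpClient

# Hypothetical mapping from detected manual actions to assembly states.
ACTION_TO_STATE = {"pick_insert": 1, "place_insert": 2, "tighten_bolt": 3}

STATE_REGISTER = 100   # placeholder holding register for the progress state
POSE_REGISTERS = 110   # placeholder base address for shared x, y, z (mm)

model = YOLO("yolov8n.pt")                            # YOLOv8 Nano weights
cobot_a = ModbusTcpClient("192.168.0.10", port=502)   # placeholder IPs
cobot_b = ModbusTcpClient("192.168.0.11", port=502)
cobot_a.connect()
cobot_b.connect()

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # Detect the current manual action in the frame.
    result = model(frame, verbose=False)[0]
    for cls_id in result.boxes.cls.tolist():
        action = result.names[int(cls_id)]
        state = ACTION_TO_STATE.get(action)
        if state is not None:
            # Relay the manual-progress state so the cobots can start their
            # next task early and gate entry into the assembly area.
            cobot_a.write_register(STATE_REGISTER, state)
            cobot_b.write_register(STATE_REGISTER, state)

    # Position sharing: read cobot A's coordinates and forward them to
    # cobot B so it can avoid the occupied region (and vice versa).
    pose_a = cobot_a.read_holding_registers(POSE_REGISTERS, count=3)
    if not pose_a.isError():
        cobot_b.write_registers(POSE_REGISTERS, pose_a.registers)

cap.release()
cobot_a.close()
cobot_b.close()
```

In practice the detector would be fine-tuned on assembly-specific action classes, and the register map would follow each cobot controller's Modbus documentation; this sketch only shows how the detection, state relay, and coordinate exchange could fit together.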