Abstract
Factories pursuing digital transformation accelerate their production and surpass their competitors by increasing controllability and efficiency. In this study, data obtained by image processing was transferred to a collaborative robot arm over 5G communication, and the robot arm was controlled remotely. A 3D-printed humanoid hand was mounted on the end of the robot arm for bin picking, its five fingers each driven by a servo motor. For finger control, the user wore a glove, and the user's finger positions were transferred to the servo motors via a flex sensor attached to each finger of the glove. In this way, the desired pick-and-place process was achieved. Position control of the robot arm was realized with image processing. The glove worn by the user was detected by two different YOLO (You Only Look Once) methods: the YOLOv4 and YOLOv5 algorithms were compared for object detection using the Python programming language. During the test phase, the highest detection accuracy on the front camera was 99.75% with the YOLOv4 algorithm and 99.83% with YOLOv5; on the side camera, the highest detection accuracy was 97.59% with YOLOv4 and 97.9% with YOLOv5.
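The glove-to-hand mapping described above, where each flex sensor drives one finger's servo, can be sketched as a simple linear rescaling from a raw sensor reading to a servo angle. This is a minimal illustration only: the ADC calibration bounds, the 0-180 degree servo travel, and the function names are assumptions, not values from the paper.

```python
# Hypothetical sketch: map each flex-sensor reading to a servo angle.
# Calibration bounds and servo range below are assumed, not from the paper.
ADC_MIN, ADC_MAX = 200, 800    # assumed raw ADC range: finger straight -> fully bent
ANGLE_MIN, ANGLE_MAX = 0, 180  # assumed servo travel in degrees

def flex_to_angle(adc_value: int) -> float:
    """Convert one flex-sensor ADC reading to a servo angle, clamped to travel."""
    adc = min(max(adc_value, ADC_MIN), ADC_MAX)          # clamp out-of-range readings
    span = (adc - ADC_MIN) / (ADC_MAX - ADC_MIN)         # normalize to 0..1
    return ANGLE_MIN + span * (ANGLE_MAX - ANGLE_MIN)

# One reading per finger: five flex sensors drive five servo motors.
readings = [200, 350, 500, 650, 800]
angles = [flex_to_angle(r) for r in readings]
```

In a real setup the computed angles would be sent to the servo controller (over the 5G link in this study's architecture); here the mapping alone is shown.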