Abstract
A localization method based on monocular vision is proposed to address the poor flexibility, high cost, and unstable accuracy of glue dispensing robots. The method comprises a workpiece image feature extraction method based on a distribution model and an optimized PnP algorithm based on depth calibration, which together locate the three-dimensional coordinates of the workpiece and further generate the gluing trajectory. First, the layout and local coordinates of the feature points are determined according to the workpiece model and the gluing process, and the feature distribution model and template set are established. Then, the image coordinates of the feature points are extracted step by step using workpiece contour features and image gray-level features, combining multi-template and multi-angle matching with shape detection, and applying acceleration strategies such as image pyramids and layer-by-layer angle subdivision. Finally, the PnP algorithm is optimized in the Z direction through a depth calibration method to achieve high-precision positioning of the workpiece. Localization experiments on various types of reducer shells were carried out under different imaging environments. The experimental results show that the method extracts features effectively from workpieces with complex structures in cluttered environments, and the maximum localization error in any single direction is within ±0.5 mm, which meets the application requirements of robotic glue-dispensing positioning. The method can simultaneously detect offsets in all six degrees of freedom of the target workpiece, giving it wider applicability than general 2D visual localization methods, and it can also be used for part localization in other scenarios.
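The core localization step summarized above, recovering the 6-DOF workpiece pose from matched feature points via PnP and then refining the depth, can be illustrated with a minimal sketch using OpenCV's solvePnP. The coordinate values, camera intrinsics, and the linear correction function correct_z below are all illustrative assumptions; the abstract does not give the paper's actual calibration model or data.

```python
import numpy as np
import cv2

# 3D coordinates of the workpiece feature points in the workpiece
# (local) frame, as would be taken from the CAD model and gluing
# process -- placeholder values, assumed coplanar on one machined face.
object_points = np.array([
    [0.0,   0.0, 0.0],
    [80.0,  0.0, 0.0],
    [80.0, 60.0, 0.0],
    [0.0,  60.0, 0.0],
    [40.0, 30.0, 0.0],
], dtype=np.float64)

# 2D image coordinates of the same points, as would be produced by the
# template-matching / shape-detection stage (placeholder values).
image_points = np.array([
    [412.3, 301.7],
    [736.9, 298.4],
    [741.2, 545.6],
    [409.8, 549.1],
    [575.4, 420.2],
], dtype=np.float64)

# Camera intrinsics from a prior calibration (placeholder values);
# lens distortion is assumed to have been removed already.
camera_matrix = np.array([
    [2400.0,    0.0, 640.0],
    [   0.0, 2400.0, 512.0],
    [   0.0,    0.0,   1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)

# Standard PnP: rotation and translation of the workpiece frame
# relative to the camera frame.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs,
                              flags=cv2.SOLVEPNP_ITERATIVE)

def correct_z(z_raw):
    """Hypothetical depth calibration: a linear correction fitted from
    measurements at known heights, standing in for the paper's
    Z-direction optimization (assumed coefficients)."""
    a, b = 1.002, -0.35
    return a * z_raw + b

if ok:
    tvec[2, 0] = correct_z(tvec[2, 0])   # refine only the depth term
    rot_mat, _ = cv2.Rodrigues(rvec)     # 3x3 rotation matrix
    print("rotation:\n", rot_mat)
    print("translation (Z depth-corrected):", tvec.ravel())
```

The pose (rot_mat, tvec) fixes all six degrees of freedom of the workpiece relative to the camera, from which a gluing trajectory defined in the workpiece frame can be transformed into robot coordinates. The linear form of correct_z is only an assumed stand-in for whatever depth-calibration model the paper fits.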