Abstract

Three-dimensional (3D) virtual calibration and visual servoing are implemented for augmented reality (AR)-assisted peg-in-hole microassembly operations. By employing a 3D model and ray casting, the 3D coordinates on the virtual mating rod corresponding to the two-dimensional (2D) virtual image points are extracted. Detection and tracking of image feature points for calibration are carried out by the proposed algorithm of regional template matching and scanning with edge fitting (RTM-SEF). To achieve sub-pixel error between the feature points in the real and virtual images, a coarse-fine virtual calibration method is proposed. With respect to the images viewed by the real and virtual cameras, a calibrated virtual camera is utilized to track the mating rod. A visual servo control law including coarse and fine tuning is proposed to ensure sub-pixel error between the most important feature point in the real and virtual images. The AR technology is mainly employed for alignment between the micropeg and the mating hole, enabling insertion of a micropeg of 80 μm diameter and 1–1.4 mm length into a mating rod with a 100 μm hole.
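
The ray-casting step described above can be illustrated with a minimal sketch (not the authors' implementation): a 2D virtual-image pixel is back-projected through an assumed pinhole camera model and intersected with a simplified stand-in for the virtual mating-rod surface (modeled here as a plane facing the camera) to recover a 3D point. All names and calibration values (K, R, t, plane parameters) are illustrative assumptions.

```python
import numpy as np

def pixel_to_ray(u, v, K, R, t):
    """Back-project pixel (u, v) into a ray (origin, direction) in the world frame."""
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # direction in camera frame
    d_world = R.T @ d_cam                              # rotate into world frame
    origin = -R.T @ t                                  # camera center in world frame
    return origin, d_world / np.linalg.norm(d_world)

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Return the 3D intersection of the ray with a plane, or None if there is none."""
    denom = plane_normal @ direction
    if abs(denom) < 1e-9:
        return None
    s = plane_normal @ (plane_point - origin) / denom
    return origin + s * direction if s > 0 else None

# Illustrative usage with assumed intrinsics and extrinsics.
K = np.array([[2000.0, 0.0, 320.0],
              [0.0, 2000.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
o, d = pixel_to_ray(350.0, 260.0, K, R, t)
p3d = intersect_plane(o, d, plane_point=np.array([0.0, 0.0, 5.0]),
                      plane_normal=np.array([0.0, 0.0, -1.0]))
print("3D point on virtual surface:", p3d)
```

In the paper's setting the ray would be intersected with the full 3D model of the mating rod rather than a single plane, but the back-projection from 2D virtual image point to 3D surface point follows the same pattern.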
