Structured light measurement is widely used in welding seam tracking because of its high precision and robustness. For narrow butt joints, however, positioning methods that reconstruct the weld contour are unsuitable, because the laser stripe produces little visible deformation when projected onto such a narrow weld. In this study, high-quality images of the laser stripe and the narrow butt joint are captured by an improved structured light vision sensor equipped with an auxiliary light source. A two-step processing framework, comprising semantic segmentation and groove positioning, is proposed to locate the feature point of the narrow butt joint. First, we design the strip pooling ENet (SP-ENet), a real-time network specifically designed to segment narrow weld images accurately. The proposed network outperforms other classical segmentation networks in segmentation accuracy and is well suited to detecting narrow butt joint welds. Second, a method combining random sample consensus (RANSAC) and iterative fitting is proposed to calculate the sub-pixel coordinates of weld feature points accurately. Finally, a trajectory smoothing model based on the Kalman filter is proposed to reduce trajectory jitter. The above methods were tested on a self-built robotic welding experimental platform. Experimental results show that the proposed method can detect and position narrow butt joints in real time. The positioning trajectory is smooth, with most positioning errors below 2 pixels, and the mean tracking error is 0.207 mm, which meets practical welding requirements.
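To make the trajectory-smoothing idea concrete, the sketch below applies a constant-velocity Kalman filter to a noisy 1-D feature-point trajectory. This is an illustrative assumption, not the paper's implementation: the state model, the function name `kalman_smooth`, and the noise parameters `q` and `r` are all chosen here for demonstration.

```python
import numpy as np

def kalman_smooth(z, dt=1.0, q=1e-3, r=0.5):
    """Smooth a 1-D feature-point trajectory with a constant-velocity
    Kalman filter. The state is [position, velocity]; only position
    is measured. q and r are assumed process/measurement noise levels."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # measure position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])     # process-noise covariance
    R = np.array([[r]])                     # measurement-noise covariance
    x = np.array([[z[0]], [0.0]])           # initial state from first sample
    P = np.eye(2)                           # initial state covariance
    out = []
    for zk in z:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step with the new measurement zk
        y = zk - (H @ x)[0, 0]              # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K * y
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)

# Usage: a slowly drifting seam position corrupted by detection jitter.
rng = np.random.default_rng(0)
t = np.arange(50)
true = 0.1 * t                              # hypothetical true trajectory (px)
noisy = true + rng.normal(0.0, 0.5, t.size)
smoothed = kalman_smooth(noisy)
```

On this synthetic drift, the filtered trajectory follows the underlying trend with visibly smaller residual scatter than the raw detections, which is the jitter-reduction effect the abstract describes.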