Abstract

Machine-vision-based defect detection is increasingly replacing manual visual inspection. In practice, images of the upper surface of cableway load sealing steel wire ropes are seriously affected by complex environments, including lubricants, adhering dust, natural light, and reflections from metal or oil stains; in addition, defect samples are scarce. This makes it difficult to apply traditional threshold-segmentation-based or supervised machine-learning-based defect detection methods directly to wire rope strand segmentation and fracture defect detection. In this study, we propose a segmentation-template-based strand segmentation method that achieves high detection accuracy and is insensitive to lighting and oil-stain interference. The method uses the structural characteristics of the steel wire rope to create a segmentation template; the best-coincidence position of this template on the real-time edge image is found through multiple translations, and the rope strands are then segmented. For fracture defect detection, and inspired by the idea of dynamic background modeling, a surface defect detection method based on the segmentation template and a spatiotemporal gray sample set is proposed. The spatiotemporal gray sample set of each pixel is built from the gray similarity of the same position in the time domain and of the pixel neighborhood in the spatial domain; from these samples, a dynamic gray background of the wire rope surface image is constructed to detect surface defects. The proposed method was tested on an image set of Z-type double-layer load sealing steel wire rope from a mine ropeway and compared with classic dynamic background modeling methods such as ViBe, KNN, and MOG2. The results show that the proposed method is more accurate and effective and adapts well to complex environments.
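The abstract does not give implementation details, but the two core ideas can be illustrated roughly as follows. The NumPy sketch below is an assumption-laden illustration, not the authors' implementation: all function names, parameter values, and the wrap-around translation shortcut are made up for clarity. The first function slides a binary strand-segmentation template over a binary edge image and keeps the translation with the highest coincidence count; the second classifies each pixel against a spatiotemporal gray sample set in the spirit of sample-based background subtraction.

```python
import numpy as np

def best_template_offset(edge_img, template, max_shift=20):
    """Hypothetical sketch: find the translation of a binary segmentation
    template that best coincides with a binary edge image.
    edge_img, template: boolean arrays of the same shape."""
    best_score, best_dx, best_dy = -1, 0, 0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # np.roll wraps around the image borders; acceptable for a sketch.
            shifted = np.roll(np.roll(template, dy, axis=0), dx, axis=1)
            score = np.count_nonzero(shifted & edge_img)  # coincidence count
            if score > best_score:
                best_score, best_dx, best_dy = score, dx, dy
    return best_dx, best_dy, best_score

def defect_mask(frame, sample_set, gray_tol=20, min_matches=2):
    """Hypothetical sketch of sample-set background subtraction: a pixel is
    treated as background if its gray value is within gray_tol of at least
    min_matches stored samples; otherwise it is flagged as a candidate defect.
    frame: (H, W) gray image; sample_set: (N, H, W) gray samples collected
    over time and from each pixel's spatial neighborhood (names assumed)."""
    diffs = np.abs(sample_set.astype(np.int16) - frame.astype(np.int16))
    matches = np.count_nonzero(diffs <= gray_tol, axis=0)
    return matches < min_matches  # True where the pixel deviates from background
```

In the method described in the abstract, the sample set would presumably be restricted to the segmented strand regions and updated frame by frame; the thresholds used here (max_shift, gray_tol, min_matches) are illustrative values only.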
