Abstract

Surface defect detection (SDD) plays an extremely important role in the manufacturing stage of products. However, it remains a fundamental yet challenging task, mainly because intraclass defects vary widely in shape and distribution, the contrast between defect regions and the background is low, and models trained on one material are difficult to adapt to others. To address this problem, we propose a complementary adversarial network-driven SDD (CASDD) framework to automatically and accurately identify various types of texture defects. Specifically, CASDD consists of an encoding–decoding segmentation module with a specially designed loss measurement and a novel complementary discriminator mechanism. In addition, to model defect boundaries and enhance the feature representation, dilated convolution (DC) layers with different rates and edge detection (ED) blocks are incorporated into CASDD. Moreover, a complementary discrimination strategy is proposed that employs two independent yet complementary discriminator modules to optimize the segmentation module more effectively: one discriminator identifies contextual features of the object regions in the input defect images, while the other focuses on edge-detail differences between the ground truth and the segmented image. To capture more edge information during training, a new composite loss containing edge information and structural features is designed. Experimental results show that CASDD is suitable for defect detection on four real-world defect databases and one artificial defect database, and that its detection accuracy is significantly better than that of state-of-the-art deep learning methods.
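
The sketch below illustrates the training scheme the abstract describes: an encoder–decoder segmenter with dilated convolutions, trained adversarially against two complementary discriminators (one on image-plus-mask region context, one on edge maps) under a composite loss. It is a minimal illustration assuming PyTorch; the module definitions (SegNet, PatchDiscriminator), the Sobel-based edge extractor, and the loss weighting are simplified placeholders, not the authors' implementation.

```python
# Minimal sketch of complementary adversarial training for segmentation-based SDD.
# Assumes PyTorch; all module names, sizes, and loss weights are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SegNet(nn.Module):
    """Toy encoder-decoder with dilated convolutions (stand-in for the segmentation module)."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=2, dilation=2), nn.ReLU(),  # dilated conv, rate 2
            nn.Conv2d(32, 32, 3, padding=4, dilation=4), nn.ReLU(),  # dilated conv, rate 4
        )
        self.dec = nn.Conv2d(32, 1, 1)  # per-pixel defect logits

    def forward(self, x):
        return self.dec(self.enc(x))

class PatchDiscriminator(nn.Module):
    """Small patch discriminator, instantiated twice: region-context branch and edge branch."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=1, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def sobel_edges(mask):
    """Sobel gradient magnitude; a simple stand-in for the edge detection (ED) blocks."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(mask, kx.to(mask.device), padding=1)
    gy = F.conv2d(mask, ky.to(mask.device), padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)

# Hypothetical instantiation and a single training step on dummy data.
seg = SegNet()
d_region = PatchDiscriminator(in_ch=4)  # defect image (3 ch) + mask (1 ch): contextual branch
d_edge = PatchDiscriminator(in_ch=1)    # edge map of the mask: edge-detail branch
opt_seg = torch.optim.Adam(seg.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(list(d_region.parameters()) + list(d_edge.parameters()), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

image = torch.rand(2, 3, 64, 64)                 # dummy defect images
gt = (torch.rand(2, 1, 64, 64) > 0.5).float()    # dummy ground-truth masks

# --- update the two complementary discriminators ---
with torch.no_grad():
    pred = torch.sigmoid(seg(image))
real_r = d_region(torch.cat([image, gt], 1))
fake_r = d_region(torch.cat([image, pred], 1))
real_e = d_edge(sobel_edges(gt))
fake_e = d_edge(sobel_edges(pred))
loss_d = (bce(real_r, torch.ones_like(real_r)) + bce(fake_r, torch.zeros_like(fake_r)) +
          bce(real_e, torch.ones_like(real_e)) + bce(fake_e, torch.zeros_like(fake_e)))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# --- update the segmenter with a composite loss: pixel-wise + edge + adversarial terms ---
logits = seg(image)
pred = torch.sigmoid(logits)
adv_r = d_region(torch.cat([image, pred], 1))
adv_e = d_edge(sobel_edges(pred))
loss_seg = (bce(logits, gt)                                    # pixel-wise segmentation term
            + F.l1_loss(sobel_edges(pred), sobel_edges(gt))    # edge-information term
            + 0.01 * (bce(adv_r, torch.ones_like(adv_r))       # fool the region discriminator
                      + bce(adv_e, torch.ones_like(adv_e))))   # fool the edge discriminator
opt_seg.zero_grad(); loss_seg.backward(); opt_seg.step()
```

In this reading, the two discriminators are complementary in the sense that the region branch judges whether a predicted mask is plausible in the context of the input image, while the edge branch judges only boundary fidelity; the composite loss weights here are arbitrary and would need tuning.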
