Abstract

Fabric defect detection is challenging due to the complexity of fabric textures and the diversity of defect types. Deep-learning-based detection algorithms currently achieve good results, but several key issues remain. First, existing network models are fixed once built: when used to detect defects in different kinds of fabric, the model cannot be adjusted flexibly to the characteristics of each fabric, which reduces detection efficiency. Second, the class imbalance among fabric defect samples makes model training harder. Moreover, defective pixels are very few compared with the total number of pixels in an image, which further increases the difficulty of fabric defect detection. To address these problems, we propose an end-to-end, deeply supervised DSUNet++ architecture with pixel-level pruning for fabric defect detection. DSUNet++ consists of an encoder, a decoder, and a series of cascade operations that fuse the detailed features of shallow layers with the abstract features of deep layers. Deep supervision is attached to the outputs at different depths of DSUNet++, so the network can be pruned appropriately for the characteristics of each fabric type, balancing network depth, speed, and precision. Furthermore, a median-frequency-balanced cross-entropy loss (CEloss_MFB) is introduced to counter the class imbalance of fabric defect samples and the drop in detection rate for small-pixel defects. Experiments show that the method achieves average detection rates of 97.68% and 99.01% on raw fabric and patterned fabric, respectively.
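
The abstract does not give the network's exact layers, but the pruning idea can be illustrated with a small sketch. Below is a minimal, hypothetical PyTorch example in the spirit of UNet++-style deep supervision: segmentation heads are attached at two decoder depths, all heads are supervised during training, and at inference a `depth` argument selects one head so that the deeper, unused branches are skipped (pruned). All module names, channel widths, and the two-level depth are illustrative assumptions, not the paper's actual DSUNet++ implementation.

```python
import torch
import torch.nn as nn

class TinyBlock(nn.Module):
    """Conv-BN-ReLU block; a stand-in for whatever block DSUNet++ actually uses."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)

class DSNet(nn.Module):
    """Nested encoder-decoder with a 1x1 segmentation head at each depth.

    Training supervises every head; at inference, `depth` picks one head,
    so the deeper branches never execute (i.e. they are pruned away).
    """
    def __init__(self, ch=16, n_classes=2):
        super().__init__()
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        # Encoder backbone.
        self.x00 = TinyBlock(3, ch)
        self.x10 = TinyBlock(ch, 2 * ch)
        self.x20 = TinyBlock(2 * ch, 4 * ch)
        # Cascaded decoder nodes fusing shallow detail with deep context.
        self.x01 = TinyBlock(3 * ch, ch)       # cat(x00, up(x10))
        self.x11 = TinyBlock(6 * ch, 2 * ch)   # cat(x10, up(x20))
        self.x02 = TinyBlock(4 * ch, ch)       # cat(x00, x01, up(x11))
        # One deep-supervision head per depth.
        self.head1 = nn.Conv2d(ch, n_classes, 1)
        self.head2 = nn.Conv2d(ch, n_classes, 1)

    def forward(self, x, depth=2):
        x00 = self.x00(x)
        x10 = self.x10(self.pool(x00))
        x01 = self.x01(torch.cat([x00, self.up(x10)], dim=1))
        out1 = self.head1(x01)
        if depth == 1:                  # pruned inference: skip the deeper branch
            return [out1]
        x20 = self.x20(self.pool(x10))
        x11 = self.x11(torch.cat([x10, self.up(x20)], dim=1))
        x02 = self.x02(torch.cat([x00, x01, self.up(x11)], dim=1))
        return [out1, self.head2(x02)]  # training: supervise both outputs

net = DSNet()
imgs = torch.randn(2, 3, 64, 64)
full = net(imgs)            # two supervised outputs for training
fast = net(imgs, depth=1)   # faster, shallower inference for an "easy" fabric
```

This is what lets one trained network serve fabrics of different difficulty: a simple raw fabric can be segmented with the shallow head at lower latency, while a complex patterned fabric uses the full depth.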
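The abstract also does not spell out the form of CEloss_MFB. Assuming "MFB" denotes the standard median frequency balancing scheme (commonly attributed to Eigen and Fergus), each class $c$ would be weighted by the ratio of the median class frequency to its own frequency:

$$
w_c = \frac{\operatorname{median}(f_1, \ldots, f_C)}{f_c},
\qquad
\mathcal{L}_{\mathrm{CE\_MFB}} = -\frac{1}{N} \sum_{i=1}^{N} \sum_{c=1}^{C} w_c \, y_{i,c} \log \hat{y}_{i,c},
$$

where $f_c$ is the fraction of training pixels labeled $c$, $y_{i,c}$ is the one-hot ground truth for pixel $i$, and $\hat{y}_{i,c}$ is its predicted probability. Because defect pixels are rare, $f_{\text{defect}}$ is small and $w_{\text{defect}}$ is correspondingly large, so the sparse defect class is not drowned out by the dominant background class.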
