Abstract

Fabric defect detection is critical to the development of the textile industry, but traditional image processing algorithms struggle to detect fabric defects reliably, and the detection efficiency and accuracy of classical deep learning models remain unsatisfactory. This paper therefore proposes YOLOv7-PCBS, an improved fabric defect detection method based on multi-scale fusion and an attention mechanism. Building on the YOLOv7 network structure, some of the standard convolutions in the backbone are replaced with Partial Convolution (PConv) modules, which reduces network computation and improves detection speed; coordinate attention is added to strengthen the extraction of positional features of tiny fabric defects; the SPPCSPC module is reconfigured to improve small-target detection; the Bidirectional Feature Pyramid Network (BiFPN) is optimized into a Tiny-BiFPN for simple and fast multi-scale feature fusion; finally, the SIoU loss function, which incorporates an angle cost, is introduced to help the predicted boxes fit the ground-truth boxes and enhance the accuracy of defect prediction. The results show that the algorithm achieves a mAP of 94.4% on the detection of defects in solid-color fabrics of six denim materials, an improvement of 15.1% over the original YOLOv7, while the model runs at 59.5 frames per second (FPS). Compared with the traditional deep learning algorithms SSD and Faster R-CNN, the detection accuracy is improved by 21.6% and 15.2%, and the FPS is improved by 78.1% and 101.0%, respectively. The proposed YOLOv7-PCBS fabric defect detection algorithm therefore delivers more accurate detection results while remaining lightweight, providing an important technical reference for the subsequent improvement of textile quality.
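
As context for the backbone modification mentioned above, the sketch below illustrates the general Partial Convolution (PConv) idea of convolving only a fraction of the input channels and passing the rest through unchanged, which is what reduces computation relative to a standard convolution. It is a minimal PyTorch illustration assuming a 0.25 partial ratio; the class name, parameters, and ratio are illustrative assumptions and not the authors' exact implementation.

```python
import torch
import torch.nn as nn


class PartialConv(nn.Module):
    """Illustrative Partial Convolution: convolve a subset of channels,
    leave the remaining channels untouched, then concatenate."""

    def __init__(self, channels: int, partial_ratio: float = 0.25):
        super().__init__()
        self.conv_channels = int(channels * partial_ratio)   # channels that are convolved
        self.pass_channels = channels - self.conv_channels   # channels passed through as-is
        self.conv = nn.Conv2d(self.conv_channels, self.conv_channels,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Split along the channel dimension, convolve only the first part,
        # then concatenate so the output shape matches the input.
        x_conv, x_pass = torch.split(x, [self.conv_channels, self.pass_channels], dim=1)
        return torch.cat((self.conv(x_conv), x_pass), dim=1)


if __name__ == "__main__":
    x = torch.randn(1, 64, 80, 80)          # a typical backbone feature map
    print(PartialConv(64)(x).shape)         # torch.Size([1, 64, 80, 80])
```

Because only a quarter of the channels pass through the 3x3 convolution in this sketch, the FLOPs of the layer drop roughly in proportion, which is consistent with the speed gain the abstract attributes to the PConv replacement.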
