Abstract
Automatic defect detection is a key component of quality control in textile production and a critical part of current fabric defect detection systems. Benefiting from the rapid development of convolutional neural networks, salient object detection models have proven feasible for fabric defect detection, yet learning more robust features and fusing them effectively for saliency estimation remain challenging. This article presents a novel saliency detection method for fabric defect detection that combines top-down and bottom-up saliency inference with two-way information flow. The top-down process infers high-level saliency by progressively exploiting higher-level, semantically richer features from the backbone. A self-fusion enhanced representation module is then proposed; it simulates a parallel processing mechanism with residual connections in a top-down manner to generate robust features that effectively characterize the textures of complex fabrics. In addition, an interactive feature fusion module is designed for the bottom-up process to perform coarse-to-fine saliency estimation: high-level saliency is gradually integrated with finer lower-level features to obtain a fine-grained result. Finally, defects are localized by segmenting the generated saliency map. Extensive experiments on two fabric image datasets demonstrate that the proposed TBINet localizes defect regions with high accuracy and performs favorably against seven state-of-the-art methods on eight widely used metrics.
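The coarse-to-fine pipeline described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the feature shapes, the averaging used as a stand-in for the self-fusion branches, the nearest-neighbour upsampling, and the mean-threshold segmentation are all illustrative assumptions; only the overall flow (top-down enhancement with a residual connection, then bottom-up fusion of coarse saliency with finer features, then segmentation of the saliency map) follows the abstract.

```python
import numpy as np

def upsample2x(x):
    # Nearest-neighbour upsampling of an (H, W, C) feature map.
    return x.repeat(2, axis=0).repeat(2, axis=1)

def self_fusion(feat):
    # Residual self-fusion (illustrative): a smoothed copy of the features
    # is added back to the input, standing in for the module's parallel
    # processing branches with a residual connection.
    smoothed = (feat + np.roll(feat, 1, axis=0) + np.roll(feat, 1, axis=1)) / 3.0
    return feat + smoothed

def interactive_fusion(high, low):
    # Coarse-to-fine step: upsample the coarse high-level saliency and
    # merge it with the finer lower-level feature map.
    return low + upsample2x(high)

# Toy backbone pyramid: three levels from fine (16x16) to coarse (4x4), 8 channels.
rng = np.random.default_rng(0)
pyramid = [rng.standard_normal((16 // 2**i, 16 // 2**i, 8)) for i in range(3)]

# Top-down: enhance each level's representation.
enhanced = [self_fusion(f) for f in pyramid]

# Bottom-up: start from the coarsest estimate and refine level by level.
saliency = enhanced[-1]                      # 4x4 coarse saliency
for low in reversed(enhanced[:-1]):          # 8x8, then 16x16
    saliency = interactive_fusion(saliency, low)

# Collapse channels into a single-channel saliency map, then segment it
# with a simple threshold to localize candidate defect regions.
saliency_map = saliency.mean(axis=2)
defect_mask = saliency_map > saliency_map.mean()
print(saliency_map.shape, defect_mask.dtype)  # (16, 16) bool
```

The loop runs from the coarsest level upward, mirroring how the high-level saliency is gradually integrated with finer lower-level features before the final segmentation.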