Abstract

Salient object detection of strip steel surface defects has recently attracted growing attention, but it remains difficult due to heavy noise, blurry defect boundaries, complex backgrounds, and diverse defect types. Existing image salient object detection methods fail to handle these challenging scenes. To address them, in this paper, we propose a novel Two-Stage Edge Reuse Network (TSERNet), which consists of two stages, i.e., prediction and refinement. In the first (prediction) stage, we construct a primary net based on the encoder-decoder architecture. The encoder not only extracts multi-scale features but also generates an edge map. In the decoder, we propose a novel Edge-aware Foreground-Background Integration (EFBI) module that distinguishes foreground from background through edge features and a reverse attention mechanism, and we exploit the decoder to generate an initial saliency map. In the second (refinement) stage, we construct a sub-net with the same architecture as the first stage. Its encoder extracts features from the initial saliency map, and its decoder deploys the Edge-aware Refinement Module (ERM), which reuses the edge map generated in the first stage, to enhance these features and purify the initial saliency map, resulting in the final saliency map. Comprehensive experiments on a public dataset show that our proposed TSERNet is consistently superior to 22 relevant state-of-the-art methods. The code and results of our method are available at https://github.com/monxxcn/TSERNet.
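As a rough illustration of the edge-aware foreground-background idea described above (this is not the authors' actual EFBI implementation; the class name, channel counts, and 3x3 convolutions are all assumptions made for this sketch), a minimal PyTorch-style module combining reverse attention with an edge map might look like:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReverseAttentionFusion(nn.Module):
    """Minimal sketch of edge-aware foreground/background integration.

    Not the paper's EFBI module; it only illustrates the abstract's idea:
    reverse attention emphasizes background regions, and an edge map
    sharpens the boundary between foreground and background.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.fg_conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.bg_conv = nn.Conv2d(channels, channels, 3, padding=1)
        # Fuse both streams plus the single-channel edge map.
        self.fuse = nn.Conv2d(2 * channels + 1, channels, 3, padding=1)

    def forward(self, feat, coarse_saliency, edge_map):
        # Foreground attention: where the coarse prediction fires.
        fg_att = torch.sigmoid(coarse_saliency)
        # Reverse attention: the complement, highlighting background.
        bg_att = 1.0 - fg_att
        fg = self.fg_conv(feat * fg_att)
        bg = self.bg_conv(feat * bg_att)
        out = self.fuse(torch.cat([fg, bg, edge_map], dim=1))
        return F.relu(out)

# Usage: feature map with matching single-channel saliency and edge maps.
feat = torch.randn(1, 64, 32, 32)
coarse = torch.randn(1, 1, 32, 32)
edge = torch.randn(1, 1, 32, 32)
module = ReverseAttentionFusion(64)
print(module(feat, coarse, edge).shape)  # torch.Size([1, 64, 32, 32])
```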
