Abstract

Existing fully convolutional network (FCN)-based salient object detection (SOD) methods have achieved strong performance by integrating diverse multi-scale context information. However, context information obtained directly from a single dilated convolution is limited: introducing dilated convolutions with different dilation rates causes a loss of local information, which constrains the model's prediction accuracy. To address this, this paper proposes a novel Aggregating Dense and Attentional Multi-scale Feature Network (DAMFNet) that generates high-quality feature representations for accurate SOD. Specifically, we first propose a dense-depth feature exploration (DDFE) module that captures robust multi-scale, multi-receptive-field context information through parallel integrated convolution (PIC) blocks and dense connections, improving the model's ability to locate salient objects and refine object details. We then develop a multi-scale channel attention enhancement (MCAE) module that further strengthens the selection of salient-object information across feature channels by integrating multiple attentional features from diverse perspectives. The proposed DAMFNet is evaluated on five public SOD benchmark datasets, and extensive experimental results demonstrate that it outperforms 18 state-of-the-art SOD methods under different evaluation metrics.
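The abstract's key technical point is that a dilated convolution enlarges the receptive field without adding parameters, but its taps skip over intermediate samples, so a single dilation rate misses local detail; combining several rates in parallel recovers multi-scale context. The paper does not give implementation details of its PIC blocks, so the sketch below is only a generic 1-D illustration of this idea (the function name and toy data are our own, not from the paper):

```python
def dilated_conv1d(x, kernel, dilation):
    """1-D dilated convolution, valid padding.

    The k kernel taps are spaced `dilation` samples apart, so the
    effective receptive field is (k - 1) * dilation + 1 while the
    number of weights stays at k.
    """
    k = len(kernel)
    span = (k - 1) * dilation + 1  # effective receptive field
    out_len = len(x) - span + 1
    return [
        sum(kernel[j] * x[i + j * dilation] for j in range(k))
        for i in range(out_len)
    ]

x = [float(v) for v in range(10)]
kernel = [1.0, 1.0, 1.0]  # 3 taps regardless of dilation

# Parallel branches with different dilation rates see different scales:
# rate 1 covers 3 samples, rate 2 covers 5, rate 4 covers 9 -- but each
# single branch skips the samples between its taps, which is the local
# information loss the abstract refers to.
for d in (1, 2, 4):
    y = dilated_conv1d(x, kernel, d)
    print(f"dilation={d}, receptive field={(len(kernel) - 1) * d + 1}, first outputs={y[:3]}")
```

A multi-scale context module in this spirit would run such branches in parallel and fuse (e.g. concatenate) their outputs, so the fused feature retains both fine local detail and wide context; the paper's dense connections additionally feed each branch the outputs of the preceding ones.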
