Abstract
Although methods based on fully convolutional neural networks (FCNs) have shown clear advantages in salient object detection, existing methods still face two challenging issues: insufficient multi-level feature fusion and blurred boundaries. To overcome these issues, we propose a novel salient object detection method based on a multi-feature fusion cross network (denoted MFC-Net). Firstly, to address insufficient multi-level feature fusion, inspired by the connection pattern of human brain neurons, we propose a novel cross network framework, combined with contextual feature transfer modules (CFTMs), to integrate, enhance and transmit multi-level feature information in an iterative manner. Secondly, to address blurred boundaries, we enhance the edge features of the saliency map with a simple edge enhancement strategy. Thirdly, to reduce the information loss incurred when generating the saliency map from fused multi-level features, we use feature fusion modules (FFMs) to learn contextual feature information from multiple perspectives before outputting the final saliency map. Finally, a hybrid loss function supervises the network at both the pixel and object levels, optimizing network performance. The proposed MFC-Net has been evaluated on five benchmark datasets. The evaluation shows that the proposed method outperforms other state-of-the-art methods, demonstrating the superiority of the MFC-Net approach.
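To make the supervision scheme concrete, the sketch below shows one plausible form of a hybrid loss combining a pixel-level and an object-level term. The abstract only states that supervision acts at both levels; the specific choice of binary cross-entropy for the pixel term and a soft IoU penalty for the object term is an assumption for illustration, not the paper's confirmed formulation.

```python
import torch
import torch.nn.functional as F

def hybrid_loss(pred, target, eps=1e-6):
    """Hybrid supervision sketch: pixel-level BCE plus an object-level IoU term.

    pred:   raw logits of shape (B, 1, H, W)
    target: binary ground-truth masks of the same shape

    The BCE/IoU pairing is an assumed example; the paper only specifies
    supervision at the pixel and object levels.
    """
    # Pixel-level term: binary cross-entropy at every location.
    bce = F.binary_cross_entropy_with_logits(pred, target)

    # Object-level term: soft IoU computed over the whole saliency map,
    # penalizing mismatch between the predicted and ground-truth regions.
    prob = torch.sigmoid(pred)
    inter = (prob * target).sum(dim=(2, 3))
    union = (prob + target - prob * target).sum(dim=(2, 3))
    iou = 1.0 - ((inter + eps) / (union + eps)).mean()

    return bce + iou
```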