Abstract
Saliency detection is an important and challenging research topic due to the variety and complexity of both background and salient regions. In this paper, we present a novel unsupervised saliency detection approach that exploits a learning-based ranking framework. First, a local linear regression model is adopted to capture the local manifold structure around every image element, which is approximately linear. Using background queries derived from the boundary prior, we construct a unified objective function that globally minimizes the errors of the local models over all image elements. The Laplacian matrix is learned by optimizing this unified objective function, using both low-level image features and high-level semantic information extracted from deep neural networks. Based on the learned Laplacian matrix, the saliency of each image element is measured as its relevance ranking with respect to the background queries. Foreground queries are then obtained from the background-based saliency map, and the relevance ranking to the foreground queries is computed in the same way. Second, we calculate an enhanced similarity matrix by fusing two different-level deep feature metrics through cross diffusion. A propagation algorithm uses this enhanced similarity matrix to better exploit the intrinsic relevance of similar regions and effectively improve the saliency ranking results. Results on four benchmark datasets with pixel-wise accurate labeling demonstrate that the proposed unsupervised method outperforms the latest state-of-the-art unsupervised methods and is competitive with deep learning-based methods.
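The following is a minimal sketch, not the authors' implementation, of the graph-based ranking idea summarized above: image elements are ranked by their relevance to background queries taken from the image boundary, and the inverted ranking is then used to pick foreground queries for a second ranking pass. The function and parameter names (`ranking_saliency`, `features`, `boundary_idx`, `sigma`, `alpha`) are hypothetical, and a simple Gaussian affinity stands in for the Laplacian matrix that the paper learns from local linear regression over low-level and deep features.

```python
# Illustrative sketch only: a standard manifold-ranking formulation
# f = (D - alpha * W)^{-1} y over per-element features, with the image
# boundary used as background queries (boundary prior). The paper's learned
# Laplacian and cross-diffused similarity are replaced by a Gaussian kernel.
import numpy as np

def ranking_saliency(features, boundary_idx, sigma=0.1, alpha=0.99):
    """Rank all image elements against background (boundary) queries.

    features     : (n, d) array, one feature vector per image element
                   (e.g. superpixel mean color); hypothetical input.
    boundary_idx : indices of elements on the image border, used as queries.
    """
    n = features.shape[0]

    # Pairwise affinity matrix W (Gaussian kernel on feature distances).
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Degree matrix D; (D - alpha * W) plays the role of the graph operator.
    D = np.diag(W.sum(axis=1))

    # Indicator vector of background queries.
    y = np.zeros(n)
    y[boundary_idx] = 1.0

    # Closed-form ranking scores: relevance of every element to the queries.
    f = np.linalg.solve(D - alpha * W, y)
    f = (f - f.min()) / (f.max() - f.min() + 1e-12)

    # High relevance to the background means low saliency, so invert.
    background_saliency = 1.0 - f

    # Second stage: threshold the background-based map to obtain foreground
    # queries, then rank against them in the same way.
    fg_idx = np.where(background_saliency > background_saliency.mean())[0]
    y_fg = np.zeros(n)
    y_fg[fg_idx] = 1.0
    f_fg = np.linalg.solve(D - alpha * W, y_fg)
    f_fg = (f_fg - f_fg.min()) / (f_fg.max() - f_fg.min() + 1e-12)
    return f_fg  # per-element saliency estimate
```

In the paper, the affinity used for ranking is not a fixed kernel but is learned by minimizing the unified local-regression objective, and the final propagation step uses a similarity matrix obtained by cross-diffusing two deep feature metrics; the sketch only conveys the query-and-rank structure.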