Abstract

Graph-based saliency detection approaches have gained great popularity due to the simplicity and efficiency of graph algorithms. In these approaches, the saliency values of image elements are ranked by their similarity to foreground or background cues via graph-based ranking. However, in previous methods, the similarity between any two image elements on the affinity graph is computed by manually defined functions that are sensitive to their parameters, and the constructed graph may not reveal the essential relevance between feature vectors extracted from different image elements. In addition, during the saliency ranking process, all the initial labels contribute equally to the ranking function, while the global saliency confidence of each image element is not taken into consideration. To address these two issues, we propose a bottom-up saliency detection approach based on affinity graph learning and weighted manifold ranking. An unsupervised learning approach is introduced to learn the affinity graph from the self-representation of the image data. By setting image boundary superpixels as background seeds, the global saliency confidence prior implied in the affinity matrix is used to weight the saliency ranking. In this manner, superpixels with higher saliency confidences are assigned higher saliency values in the final saliency map, and background superpixels are efficiently suppressed. Comprehensive evaluations on three challenging datasets indicate that our algorithm consistently surpasses other unsupervised graph-based saliency detection methods.
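To make the ranking step concrete, the sketch below implements the classic closed-form graph-based manifold ranking that this line of work builds on, where saliency scores are obtained as f* = (D − αW)⁻¹y for an affinity matrix W, degree matrix D, and seed indicator vector y. This is a minimal illustration of plain manifold ranking on a hand-set affinity matrix; it does not include the paper's learned affinity graph or confidence-weighted ranking, and the toy graph and parameter value are assumptions for demonstration only.

```python
import numpy as np

def manifold_ranking(W, y, alpha=0.99):
    """Closed-form manifold ranking: solve (D - alpha*W) f = y.

    W     : symmetric affinity matrix over image elements (superpixels)
    y     : indicator vector of seed labels (e.g. boundary background seeds)
    alpha : trade-off between the smoothness and fitting terms (assumed 0.99)
    """
    D = np.diag(W.sum(axis=1))            # degree matrix of the graph
    return np.linalg.solve(D - alpha * W, y)

# Toy chain of 4 superpixels; the last one is marked as a seed.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])        # seed indicator vector
scores = manifold_ranking(W, y)
# Scores decay with graph distance from the seed, so elements strongly
# connected to the seeds receive the highest ranking values.
```

In a saliency pipeline such scores, computed with boundary superpixels as background seeds, are complemented (1 − normalized score) to obtain the foreground saliency map.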

