Abstract
Co-saliency detection refers to the computational process of identifying common yet prominent salient foreground regions across a group of related images. However, most co-saliency detection methods suffer from two limitations. First, co-saliency detection models largely generate superpixel-level co-saliency maps, which sacrifices significant information available in the pixel-level input images. Second, co-saliency detection frameworks mostly rely on models redesigned specifically for detecting co-salient objects in an image group, rather than utilizing existing single-image saliency detection models. To address these problems, we propose a novel framework, Co-saliency via Regularized Random Walk Ranking (CR2WR), which produces highly efficient pixel-level co-saliency maps and leverages existing single-image saliency models to detect co-salient objects in an image sequence. This is achieved by: (1) introducing regularized random walk ranking as the ranking function in a two-stage co-saliency detection framework; (2) designing a novel weighting function that incorporates more image information into graph construction, together with a normalized Laplacian matrix, to obtain efficient co-saliency maps; and (3) fusing the generated saliency maps with high-level priors, namely location and objectness priors, to enhance the detection of co-salient regions. Suitably designed novel objective functions yield an enriched solution. The proposed model is evaluated on challenging benchmark co-saliency datasets and is demonstrated to outperform prominent state-of-the-art methods in terms of efficiency and computational time.
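As a rough illustration of the kind of graph-based ranking the abstract refers to, the sketch below implements the classical manifold-ranking solution with a normalized affinity matrix, which regularized random walk ranking frameworks commonly build on. The function name, parameters, and objective here are illustrative assumptions, not the exact CR2WR formulation.

```python
import numpy as np

def graph_ranking(W, y, alpha=0.99):
    """Generic graph-based ranking sketch (not the exact CR2WR objective).

    W     : (n, n) symmetric affinity matrix over graph nodes
            (e.g., superpixels or pixels)
    y     : (n,) indicator vector marking query/seed nodes
    alpha : trade-off between graph smoothness and fitting the seeds

    Returns the closed-form manifold-ranking scores
    f = (I - alpha * S)^{-1} y with S = D^{-1/2} W D^{-1/2},
    i.e., ranking regularized by the normalized graph Laplacian.
    """
    d = W.sum(axis=1)                                  # node degrees
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))   # guard against isolated nodes
    S = (W * d_inv_sqrt).T * d_inv_sqrt                # symmetric normalization D^{-1/2} W D^{-1/2}
    n = W.shape[0]
    f = np.linalg.solve(np.eye(n) - alpha * S, y)      # closed-form ranking scores
    return f
```

In a two-stage setting of the kind the abstract describes, such a ranking step would typically be run once per image with seeds taken from a single-image saliency map, and again with seeds propagated across the image group; the resulting scores would then be fused with high-level priors such as location and objectness. That pipeline description is likewise a hedged reading of the abstract, not a specification of the authors' method.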