Abstract
This paper proposes a saliency detection method based on a two-stage absorbing Markov chain that exploits both background and foreground cues to detect salient objects in images. First, the image is preprocessed: a convex hull is constructed and superpixel segmentation is performed to prepare for subsequent processing. Second, based on boundary connectivity, superpixels with low background probability are removed from the candidate boundary background set B0, yielding the boundary background set B1. Guided by the node saliency values in the boundary-prior saliency map Sbg1, additional background seeds are selected from the region outside the candidate boundary background set B0 and the convex hull H, producing the updated background seed set B. A background-absorbing Markov chain is then constructed to generate the background-absorbing saliency map Sbg2, and fusing Sbg1 and Sbg2 yields the first-stage background-based saliency map Sbg. Third, within the convex hull H, the foreground seed set F is determined according to Sbg, and a foreground-absorbing Markov chain is constructed to obtain the second-stage foreground-absorbing saliency map Sfg. Finally, the two-stage saliency maps Sbg and Sfg are combined into a fused saliency map S, which is optimized by a smoothing mechanism to produce the final saliency map S∗. Experiments on three public image datasets show that the proposed method detects salient objects accurately and significantly outperforms traditional methods.
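The core quantity in both stages is the expected absorption time of an absorbing Markov chain over superpixel nodes: nodes far (in feature space) from background-seed absorbing nodes take long to be absorbed and are thus likely salient. A minimal sketch of that computation, assuming a precomputed superpixel affinity matrix `W` and a seed mask; the function name and interface are illustrative, not the authors' code:

```python
import numpy as np

def absorbed_time_saliency(W, absorbing):
    """Expected absorption time of each transient node, used as a
    background-absorbing saliency cue (illustrative sketch).

    W         : (n, n) nonnegative superpixel affinity matrix.
    absorbing : (n,) boolean mask marking absorbing (seed) nodes.
    """
    n = W.shape[0]
    # Row-normalize affinities into transition probabilities.
    P = W / W.sum(axis=1, keepdims=True)
    transient = ~absorbing
    # Transient-to-transient block Q of the transition matrix.
    Q = P[np.ix_(transient, transient)]
    # Fundamental matrix N = (I - Q)^(-1); absorbed time y = N @ 1.
    N = np.linalg.inv(np.eye(Q.shape[0]) - Q)
    y = N @ np.ones(Q.shape[0])
    s = np.zeros(n)
    s[transient] = y          # absorbing (seed) nodes get time 0
    return s / s.max()        # normalize to [0, 1]
```

For the second stage, where the absorbing nodes are foreground seeds, short absorbed time indicates saliency, so the normalized map would be inverted before fusion.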
Journal of Visual Communication and Image Representation