Abstract
Salient object detection in the wavelet domain has recently begun to attract researchers' attention because of its ability to provide multi-scale analysis of an image simultaneously in both the frequency and spatial domains. The proposed algorithm exploits the inherent multi-scale structure of the double-density dual-tree complex wavelet transform (DDDTCWT) to decompose each input image into four approximate sub-band images and 32 high-pass detailed sub-band images at each scale. These 32 detailed high-pass sub-bands at each scale are adequate to represent the singularities of any geometric object with high precision and to mimic the zooming-in and zooming-out process of the human visual system. In the proposed model, we first compute a rough segmented saliency map (RSSM) by fusing multi-scale edge-to-texture features generated from the DDDTCWT with segmentation results obtained from a bipartite graph partitioning-based segmentation approach. Each pixel in the RSSM is then assigned to either the background region or the salient region based on a threshold. Finally, the pixels of the two regions are treated as samples drawn from a multivariate kernel function whose parameters are estimated using the expectation-maximization algorithm, yielding the final saliency map. The performance of the proposed model is evaluated in terms of precision, recall, F-measure, area under the ROC curve (AUC) and computation time on six publicly available image datasets. Extensive experiments on these six benchmark datasets demonstrate that the proposed model outperforms 29 existing state-of-the-art methods in terms of F-measure on five of the datasets, recall on four datasets and AUC on two datasets. In terms of mean recall, mean F-measure and mean AUC over all six datasets, the proposed method outperforms all of the state-of-the-art methods. The proposed method also requires less computation time than many existing spatial-domain methods.
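To make the final step of the pipeline concrete, the sketch below shows how a thresholded RSSM can be turned into a pixel-wise saliency map by fitting per-region densities with EM. This is a minimal illustration under assumed simplifications, not the authors' implementation: the DDDTCWT feature extraction and bipartite-graph segmentation are taken as given (a precomputed rssm array), per-pixel Lab colour is assumed as the feature vector, and scikit-learn's EM-fitted GaussianMixture stands in for the paper's multivariate kernel function; the function name saliency_from_rssm, the threshold of 0.5 and the number of mixture components are hypothetical choices.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def saliency_from_rssm(image_lab, rssm, threshold=0.5, n_components=3):
        """image_lab: HxWx3 float array of per-pixel features (e.g. Lab colour).
        rssm: HxW rough segmented saliency map with values in [0, 1]."""
        h, w, c = image_lab.shape
        feats = image_lab.reshape(-1, c)

        # Step 1: split pixels into salient / background samples by thresholding the RSSM.
        salient_mask = rssm.reshape(-1) >= threshold
        fg_samples = feats[salient_mask]
        bg_samples = feats[~salient_mask]

        # Step 2: fit one EM-estimated density model per region
        # (a stand-in here for the paper's multivariate kernel function).
        fg_model = GaussianMixture(n_components=n_components).fit(fg_samples)
        bg_model = GaussianMixture(n_components=n_components).fit(bg_samples)

        # Step 3: saliency of each pixel = posterior probability of the salient class
        # under the two fitted densities (equal priors assumed for illustration).
        log_fg = fg_model.score_samples(feats)
        log_bg = bg_model.score_samples(feats)
        saliency = 1.0 / (1.0 + np.exp(log_bg - log_fg))
        return saliency.reshape(h, w)

In this sketch the posterior is computed with equal class priors; weighting the two densities by the fraction of pixels assigned to each region would be a natural refinement.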