Abstract

Salient regions attract more human attention than other regions of an image. Both low-level and high-level features are used in salient region detection: low-level features carry primitive information such as color or texture, while high-level features model properties of the human visual system. Recently, several salient region detection methods have been proposed that rely only on low-level features or only on high-level features; combining the two is necessary to overcome their respective limitations. In this paper, a novel saliency detection method is proposed that uses both low-level and high-level features. Color difference and texture difference serve as low-level features, while the human tendency to attend to the center of an image is modeled as a high-level feature. In this approach, color saliency maps are extracted from each channel of the Lab color space, and texture saliency maps are extracted using the wavelet transform and the local variance of each channel. These feature maps are then fused to construct the final saliency map. In a post-processing step, morphological operators and connected-component analysis are applied to the final saliency map to produce more contiguous salient regions. We compare the proposed method with four state-of-the-art methods on the MSRA (Microsoft Research Asia) dataset; the F-measure averaged over 1000 MSRA images is 0.7824. Experimental results demonstrate that the proposed method outperforms the existing methods in salient region detection.
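The fusion pipeline the abstract describes can be illustrated with a minimal numpy sketch. This is not the paper's implementation: it uses simple stand-ins — per-channel color contrast measured as distance from the channel mean, local variance as the texture cue (the paper additionally uses wavelet subbands, omitted here), a Gaussian map as the center prior, and plain averaging for fusion. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_center_prior(h, w, sigma=0.3):
    """Center-prior (high-level) map: pixels near the image center get
    higher weight. sigma is relative to image size (illustrative choice)."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    d2 = ((ys - cy) / (sigma * h)) ** 2 + ((xs - cx) / (sigma * w)) ** 2
    return np.exp(-0.5 * d2)

def local_variance(channel, k=5):
    """Texture cue: variance over a k x k window, computed with the
    integral-image (cumulative-sum) trick for efficiency."""
    pad = k // 2
    x = np.pad(channel.astype(np.float64), pad, mode="edge")
    def box_mean(a):
        c = np.cumsum(np.cumsum(a, axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))
        return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)
    m = box_mean(x)
    m2 = box_mean(x ** 2)
    return np.clip(m2 - m ** 2, 0.0, None)  # Var = E[x^2] - E[x]^2

def normalize(m):
    """Rescale a map to [0, 1]; a flat map stays all-zero."""
    m = m - m.min()
    return m / m.max() if m.max() > 0 else m

def saliency_map(lab):
    """lab: H x W x 3 float array of Lab channels.
    Returns the fused saliency map in [0, 1]."""
    h, w, _ = lab.shape
    # Low-level color cue: per-channel distance from the channel mean.
    color_maps = [normalize(np.abs(lab[:, :, c] - lab[:, :, c].mean()))
                  for c in range(3)]
    # Low-level texture cue: per-channel local variance.
    texture_maps = [normalize(local_variance(lab[:, :, c]))
                    for c in range(3)]
    low_level = np.mean(color_maps + texture_maps, axis=0)
    # Fuse with the high-level center prior and rescale.
    return normalize(low_level * gaussian_center_prior(h, w))
```

In a full implementation, the fused map would then be thresholded and cleaned with morphological opening/closing and connected-component filtering, as the post-processing step describes.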
