Abstract
A novel shape-based image retrieval approach is presented in this study. Because the foreground and background contents of images strongly conceal each other, they are represented individually in the proposed approach to reduce their mutual influence. The Otsu method is employed to segment the foreground from the background, and the saliency map and edge map are then identified. Because saliency reduces the time cost of feature computation, salient edges are computed for the foreground and background images based on the selective visual attention model. Autocorrelation-based chordiogram image descriptors are computed separately for the foreground and background images and then combined hierarchically to form the proposed descriptor. This approach avoids the mutual concealment of foreground and background information, and the resulting descriptor is rich in geometric information as well as the underlying texture, structural, and spatial information. The proposed shape-based descriptor performs considerably better than conventional descriptors at content-based image retrieval. It was extensively tested on the Gardens Point Walking, St Lucia, University of Alberta Campus, Corel 10k, and self-photographed image data sets, and the precision and recall values of the proposed approach were compared with those of state-of-the-art approaches for shape-based image retrieval from these databases. The proposed shape descriptor provided satisfactory retrieval results in the experiments.
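The abstract outlines a pipeline (Otsu segmentation, salient edge extraction, per-region descriptors, hierarchical combination). The following is a minimal sketch of that flow, assuming OpenCV and NumPy; the `chordiogram_descriptor` function is a hypothetical stand-in, since the paper's actual autocorrelation-based chordiogram is not specified in this abstract.

```python
import cv2
import numpy as np

def segment_foreground(gray):
    """Separate foreground from background with Otsu thresholding."""
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

def salient_edges(gray, mask):
    """Edge map restricted to one region (foreground or background)."""
    edges = cv2.Canny(gray, 100, 200)
    return cv2.bitwise_and(edges, edges, mask=mask)

def chordiogram_descriptor(edge_map, bins=32):
    """Hypothetical placeholder: a normalized histogram of pairwise chord
    lengths between edge points, subsampled to keep the O(n^2) step small."""
    ys, xs = np.nonzero(edge_map)
    pts = np.stack([xs, ys], axis=1).astype(np.float32)
    if len(pts) > 500:
        pts = pts[np.random.choice(len(pts), 500, replace=False)]
    if len(pts) < 2:
        return np.zeros(bins, dtype=np.float32)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    hist, _ = np.histogram(d[np.triu_indices(len(pts), k=1)], bins=bins)
    return hist / (hist.sum() + 1e-8)

def describe(image_bgr):
    """Compute foreground and background descriptors separately and
    concatenate them, mirroring the hierarchical combination described
    at a high level in the abstract."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    fg_mask = segment_foreground(gray)
    bg_mask = cv2.bitwise_not(fg_mask)
    fg_desc = chordiogram_descriptor(salient_edges(gray, fg_mask))
    bg_desc = chordiogram_descriptor(salient_edges(gray, bg_mask))
    return np.concatenate([fg_desc, bg_desc])

def retrieve(query_bgr, database):
    """Rank database images (list of (name, image) pairs) by descriptor distance."""
    q = describe(query_bgr)
    scored = [(name, float(np.linalg.norm(q - describe(img)))) for name, img in database]
    return sorted(scored, key=lambda t: t[1])
```

This sketch keeps foreground and background representations separate until the final concatenation, which is the core idea the abstract emphasizes; the real method's saliency model and descriptor construction would replace the Canny and histogram placeholders used here.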