Abstract

Image classification usually requires complicated segmentation to separate foreground objects from the background scene. However, the statistical content of the background scene can itself provide useful information for classification. In this paper, we propose a new hybrid pyramid kernel that incorporates local features extracted from both dense regular grids and interest points for image classification, without requiring segmentation. Features extracted from dense regular grids better capture information about the background scene, while interest points detected at corners and edges better capture information about the salient objects. In our algorithm, these two kinds of local features are combined in both the spatial and the feature-space domains and organized into pyramid representations. To obtain better classification accuracy, we fine-tune the parameters of the similarity measure and determine discriminative regions by means of relevance feedback. Experimental results show that our algorithm achieves a 6.37% improvement over other pyramid-representation-based methods. To evaluate the applicability of the proposed hybrid kernel to large-scale databases, we performed a cross-dataset experiment and investigated the effect of foreground/background features on each kernel. The proposed hybrid kernel is proven to satisfy Mercer's condition, and it measures the similarity between image features efficiently: its computational complexity is proportional to the number of features.
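The abstract does not spell out the kernel itself. As a point of reference, the following is a minimal sketch of how a pyramid-based similarity of this general kind is typically computed, assuming the standard histogram-intersection pyramid matching of Grauman and Darrell / Lazebnik et al.; the function names, the level weighting, and the mixing weight `alpha` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def pyramid_match(hist_a, hist_b):
    """Weighted histogram intersection over pyramid levels.

    hist_a, hist_b: lists of 1-D feature histograms, one per pyramid
    level, ordered coarse (level 0) to fine (level L). Histograms at
    the same level must share the same binning.
    """
    L = len(hist_a) - 1
    similarity = 0.0
    for level, (ha, hb) in enumerate(zip(hist_a, hist_b)):
        # Histogram intersection counts the matches found at this
        # resolution; it runs in time linear in the number of bins,
        # hence linear in the number of features overall.
        matches = np.minimum(ha, hb).sum()
        # Standard spatial-pyramid weighting: matches at coarser
        # levels are discounted because they are easier to obtain.
        weight = 1.0 / 2 ** L if level == 0 else 1.0 / 2 ** (L - level + 1)
        similarity += weight * matches
    return similarity

# A hybrid kernel could then be a weighted sum of two such pyramid
# kernels, e.g. one over dense-grid histograms and one over
# interest-point histograms (alpha is a hypothetical mixing weight):
def hybrid_kernel(grid_a, grid_b, ip_a, ip_b, alpha=0.5):
    return (alpha * pyramid_match(grid_a, grid_b)
            + (1 - alpha) * pyramid_match(ip_a, ip_b))
```

This construction is consistent with the abstract's claims: histogram intersection is a Mercer kernel, non-negative weighted sums of Mercer kernels remain Mercer kernels, and the cost of each intersection is linear in the number of features.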
