Abstract
Due to the considerable increase of images in everyday life, many applications require a study of their similarity. The main challenge is to find a simple and efficient method for comparing and classifying image pairs into similar and dissimilar classes. This study presents a new method for image-pair comparison and classification based on modeling the Local Dissimilarity Map (LDM). The LDM is a tool for locally measuring the dissimilarity between two binary or grayscale images. It is based on a modified version of the Hausdorff distance, which quantifies the dissimilarities between images locally; the measure is parameter-free and generic. The image-pair classification (two-class) method is structured as follows. First, a statistical model of the LDM is proposed; the model parameters, used as descriptors, are relevant for discriminating similar and dissimilar image pairs. Second, classifiers are applied to compute the classification scores. In addition, compared with state-of-the-art similarity measures, this approach is robust to geometric transformations such as translation. Although the main objective of this paper is to apply our approach to image-pair classification, it is also evaluated on classification with more than two classes (multi-class classification). Experiments on the well-known *NIST image data sets and on an old-print data set show that the proposed method produces comparable or even better results than state-of-the-art methods in terms of accuracy and <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$F_{1}$ </tex-math></inline-formula> score.
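For intuition, a minimal sketch of one common LDM formulation for binary images is given below: at each pixel, the dissimilarity is non-zero only where the two images disagree, weighted by the Euclidean distance from that pixel to the nearest foreground pixel of each image (a local, Hausdorff-style term). This is an illustrative assumption based on the description above, not the paper's exact definition; `dist_to_foreground` and `local_dissimilarity_map` are hypothetical helper names, and the brute-force distance transform is used only to keep the sketch self-contained.

```python
import numpy as np

def dist_to_foreground(mask):
    """Brute-force Euclidean distance transform: for every pixel,
    the distance to the nearest True (foreground) pixel of `mask`.
    Assumes `mask` has at least one foreground pixel."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([ys, xs], axis=1).astype(float)       # (P, 2) foreground coords
    h, w = mask.shape
    grid = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                indexing="ij"), axis=-1).astype(float)  # (H, W, 2)
    # Distance from every pixel to every foreground point, then take the minimum.
    d = np.linalg.norm(grid[:, :, None, :] - pts[None, None, :, :], axis=-1)
    return d.min(axis=-1)

def local_dissimilarity_map(a, b):
    """Sketch of a local dissimilarity map between two binary images:
    |a(x) - b(x)| * max(d(x, foreground of a), d(x, foreground of b))."""
    a = a.astype(bool)
    b = b.astype(bool)
    disagree = a ^ b                      # pixels where the images differ
    d_a = dist_to_foreground(a)
    d_b = dist_to_foreground(b)
    return disagree * np.maximum(d_a, d_b)
```

Identical images yield an all-zero map, while larger values concentrate where one image has structure far from any structure in the other, which is what makes summary statistics of the map usable as classification descriptors.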