Abstract
We propose a perceptually based system for pattern retrieval and matching. The central idea of the work is that similarity judgment must be modeled along perceptual dimensions. Hence, we detect the basic visual categories that people use when judging similarity, and design a computational model that accepts patterns as input and, depending on the query, produces a set of choices that follow human behavior in pattern matching. To understand how humans perceive color patterns, we performed a subjective experiment. The experiment yielded five perceptual criteria used in comparisons between color patterns (the vocabulary), as well as a set of rules governing the use of these criteria in similarity judgment (the grammar). This paper describes the implementation of these perceptual criteria and rules in the image retrieval system. Following the processing typical of human vision, we designed the system to: (a) extract perceptual features from the vocabulary and (b) compare patterns according to the grammar rules. We propose new color and texture features, as well as new distance functions that correlate with human performance. The performance of the system is illustrated with numerous examples from image databases in different application domains.
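To make the two-stage design concrete, here is a minimal sketch of how feature extraction followed by rule-based comparison might be organized. The criterion names, weights, and the weighted L1 distance are illustrative assumptions, not the features or distance functions actually proposed in the paper.

```python
# Hypothetical sketch: each pattern is summarized by a score along
# five perceptual criteria (placeholder names, not the paper's).
CRITERIA = ["overall_color", "directionality", "regularity",
            "color_purity", "complexity"]

def perceptual_distance(a, b, weights=None):
    """Weighted L1 distance over per-criterion scores.

    The weights stand in for the grammar rules: a query that emphasizes
    one criterion would assign it a larger weight.
    """
    if weights is None:
        weights = {c: 1.0 for c in CRITERIA}
    return sum(weights[c] * abs(a[c] - b[c]) for c in CRITERIA)

def rank(query, candidates, weights=None):
    """Return candidate ids ordered from most to least similar."""
    return sorted(candidates,
                  key=lambda cid: perceptual_distance(query, candidates[cid],
                                                      weights))
```

In this sketch, changing the weights changes the ranking, mirroring the idea that the same vocabulary of criteria can serve different queries under different grammar rules.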