Abstract

Sketch-based image retrieval is an important research topic in image processing. Hand-drawn sketches consist only of contour lines and lack detailed information such as color and textons, so their feature distributions differ significantly from those of color images, making sketch-based image retrieval a typical cross-domain retrieval problem. To address this problem, we construct a perceptual space consistent with both textures and sketches, and use perceptual similarity for sketch-based texture retrieval. We first conduct a set of psychological experiments to analyze the perceived visual similarity of the textures, and then create a dataset of over a thousand hand-drawn sketches corresponding to those textures. We propose a layer-wise perceptual similarity learning method, with which we train a similarity prediction network to learn the perceptual similarity between hand-drawn sketches and natural texture images. The trained network can then be used for perceptual similarity prediction and efficient retrieval. Our experimental results demonstrate the effectiveness of sketch-based texture retrieval using perceptual similarity.
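The abstract does not specify the network architecture, but "layer-wise perceptual similarity" is commonly computed as a weighted sum of per-layer feature distances (as in LPIPS-style metrics). The sketch below illustrates that general idea with plain NumPy; the function name, the fixed layer weights, and the use of pooled per-layer feature vectors are all assumptions for illustration, not the authors' actual method.

```python
import numpy as np

def layerwise_perceptual_distance(feats_a, feats_b, layer_weights):
    """Aggregate per-layer feature distances into one perceptual distance.

    feats_a, feats_b: lists of 1-D feature vectors, one per network layer,
    e.g. pooled activations extracted from a sketch and a texture image.
    layer_weights: per-layer weights (learned in practice; fixed here).
    """
    total = 0.0
    for fa, fb, w in zip(feats_a, feats_b, layer_weights):
        # Unit-normalize each layer's features before comparing, as is
        # common in perceptual-similarity metrics.
        fa = fa / (np.linalg.norm(fa) + 1e-8)
        fb = fb / (np.linalg.norm(fb) + 1e-8)
        total += w * np.sum((fa - fb) ** 2)
    return total

# Toy usage: identical features give zero distance, differing features
# give a positive distance, so items can be ranked for retrieval.
feats = [np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]
other = [np.array([1.0, 0.0, 0.0, 0.0]), np.array([4.0, 3.0, 2.0, 1.0])]
d_same = layerwise_perceptual_distance(feats, feats, [0.5, 0.5])
d_diff = layerwise_perceptual_distance(feats, other, [0.5, 0.5])
```

At retrieval time, a query sketch would be compared against every texture in the database with such a distance, and results returned in ascending order.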
