Abstract
Web-based image search engines rely mostly on surrounding textual features. It is difficult for them to interpret a user's search intention from query keywords alone, which leads to ambiguous and noisy search results that do not meet the user's expectations. To resolve this ambiguity in text-based image retrieval, we exploit the visual information of a query image. In this project, we implement a novel Internet image search approach in which the user is asked to click on a single query image, with minimal effort, and visually relevant images are retrieved from a large database. Our main goal is to capture the user's search intention from this one-click query image through the following steps. The user first submits a query keyword, and a pool of images is retrieved by text-based search. The user is then asked to select a query image from this pool, and the images in the pool are re-ranked according to their color and texture similarities to the query image, computed using the Euclidean distance. A query-specific color similarity metric and a query-specific texture similarity metric are learned from the selected example and used to rank the images. These similarity metrics reflect the user's intention at a finer level, since every query image has its own metrics.
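As a rough illustration of the re-ranking step described above, the following Python sketch ranks pool images by Euclidean distance between color/texture feature vectors. The feature layout, vector dimensions, and function name are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

def rerank_by_similarity(query_features, pool_features):
    """Rank pool images by Euclidean distance to the query image's
    combined color/texture feature vector (smaller distance = more similar)."""
    query = np.asarray(query_features, dtype=float)
    distances = [np.linalg.norm(np.asarray(f, dtype=float) - query)
                 for f in pool_features]
    # Return pool indices ordered from most to least similar to the query.
    return sorted(range(len(pool_features)), key=lambda i: distances[i])

# Hypothetical example: a 3-bin color histogram concatenated with a
# 2-value texture descriptor for each image.
query_vec = [0.2, 0.5, 0.3, 0.8, 0.1]
pool_vecs = [
    [0.1, 0.6, 0.3, 0.7, 0.2],    # image 0
    [0.9, 0.05, 0.05, 0.2, 0.9],  # image 1
    [0.25, 0.45, 0.3, 0.8, 0.1],  # image 2
]
print(rerank_by_similarity(query_vec, pool_vecs))  # -> [2, 0, 1]
```

In practice, the learned query-specific metrics would weight the color and texture components differently for each query image; the sketch uses an unweighted Euclidean distance for simplicity.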