Abstract

Image search engines (e.g. Google Image Search, Bing Image Search) rely mostly on textual features, such as the query keywords and the text surrounding an image. Interpreting a user's search intention from a single keyword query is difficult, which leads to ambiguous and noisy search results. To resolve this ambiguity, we consider visual information together with the textual features: the user clicks one image among the search results, and the results are re-ranked based on similarity in both visual and textual content. Our approach captures the user's search intention from this one click in four steps. First, the query image is classified into one of several predefined adaptive weight categories, which guides the re-ranking of the text-based search results. Second, the query keywords are expanded based on the visual content of the selected query image, which helps capture the user's intention. Third, the expanded keywords are used to enlarge the image pool so that it contains more images relevant to the query image. Fourth, the expanded keywords are also used to expand the query image into multiple positive example images, from which similarity metrics are learned for re-ranking. Finally, the images similar to the query image are re-ranked with the help of photo quality assessment to provide better search results.
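The core re-ranking idea above can be sketched in a few lines: after the user's one click, each candidate in the image pool is scored by a weighted combination of visual and textual similarity to the query image, with the weights chosen per predefined query-image category. This is a minimal illustrative sketch, not the paper's implementation; the category names, weight values, and feature representations are all assumptions.

```python
# Hedged sketch of intent-based re-ranking: score each candidate by a
# category-dependent weighted sum of visual and textual similarity.
# Categories, weights, and feature vectors below are illustrative only.

from math import sqrt

# Hypothetical adaptive weights per query-image category:
# (visual_weight, textual_weight)
ADAPTIVE_WEIGHTS = {
    "portrait": (0.7, 0.3),
    "scenery": (0.6, 0.4),
    "general": (0.5, 0.5),
}

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rerank(query_visual, query_text, candidates, category="general"):
    """Re-rank candidates, given as (id, visual_vec, text_vec) tuples,
    by combined visual + textual similarity to the clicked query image."""
    wv, wt = ADAPTIVE_WEIGHTS.get(category, ADAPTIVE_WEIGHTS["general"])
    scored = [
        (wv * cosine(query_visual, v) + wt * cosine(query_text, t), cid)
        for cid, v, t in candidates
    ]
    return [cid for _, cid in sorted(scored, reverse=True)]
```

For example, a candidate whose visual and textual features both match the query image outranks one that matches on neither, and shifting the category weights changes how much a purely visual match can compensate for a poor textual match.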
