Abstract

Image search engines show results differently from general Web search engines in three key ways: (1) most Web-based image search engines adopt a two-dimensional result placement instead of a linear result list; (2) image searches show snapshots instead of snippets (query-dependent abstracts of landing pages) on search engine result pages (SERPs); and (3) pagination is usually not (explicitly) supported on image search SERPs, so users can view results without having to click a "next page" button. Compared with the extensive study of user behavior in general Web search scenarios, there has been no thorough investigation of how the different interaction mechanisms of image search engines affect users' examination behavior. To shed light on this research question, we conducted an eye-tracking study to investigate users' examination behavior in image searches. We focus on the impact of factors that influence examination, including position, visual saliency, edge density, and the presence of textual information and human faces in result images. Three interesting findings indicate users' behavior biases: (1) instead of the traditional "Golden Triangle" phenomenon observed in user examination patterns of general Web search, we observe a middle-position bias; (2) besides the position factor, the content of image results (e.g., visual saliency) affects examination behavior; and (3) some popular behavior assumptions in general Web search (e.g., the examination hypothesis) do not hold in image search scenarios. We further predict users' examination behavior using these impact factors. Results show that combining position and visual content features can improve examination prediction in image searches.
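To make the feature-combination idea concrete, the following is a minimal sketch of predicting whether an image result is examined from position and visual-content features. It is not the authors' model: the feature names, the synthetic data, and the choice of a logistic-regression classifier are assumptions made purely for illustration.

```python
# Illustrative sketch (not the paper's model): predicting whether an image
# result is examined, comparing a position-only baseline against a model
# that also uses visual-content features. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Hypothetical per-result features:
#   row, col      -- grid position of the result on the SERP
#   saliency      -- mean visual saliency of the image
#   edge_density  -- fraction of edge pixels
#   has_text      -- whether the image contains text
#   has_face      -- whether the image contains a human face
X = np.column_stack([
    rng.integers(0, 10, n),   # row
    rng.integers(0, 5, n),    # col
    rng.random(n),            # saliency
    rng.random(n),            # edge_density
    rng.integers(0, 2, n),    # has_text
    rng.integers(0, 2, n),    # has_face
])
# Synthetic "examined" labels for demonstration purposes only.
logits = -0.4 * X[:, 0] + 2.0 * X[:, 2] + 0.5 * X[:, 5] + rng.normal(0, 1, n)
y = (logits > np.median(logits)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Position-only baseline vs. position + visual-content features.
pos_model = LogisticRegression(max_iter=1000).fit(X_train[:, :2], y_train)
full_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("position only AUC:",
      roc_auc_score(y_test, pos_model.predict_proba(X_test[:, :2])[:, 1]))
print("pos + content AUC:",
      roc_auc_score(y_test, full_model.predict_proba(X_test)[:, 1]))
```

On real examination data, the comparison between the two models would indicate how much visual-content features add over position alone; here the numbers only demonstrate the mechanics of the comparison.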
