Abstract

Gathering information and making decisions based on retrieved information are core tasks for every intelligence analyst. User modeling techniques have been exploited to help analysts search for information effectively. To justify the effect of any user modeling technique on helping analysts retrieve quality documents relevant to their tasks, we need a comprehensive evaluation method that assesses improvements in both retrieval performance and user performance. In this paper, we describe our evaluation of a cognitive user model for information retrieval with regard to retrieval performance. Our user model captures user intent dynamically by analyzing behavioral information from retrieved relevant documents, with the goal of improving both retrieval performance and user performance. In this evaluation, we assess the user model's short-term effects on a single query and its long-term effects on the whole search session. We compare our approach with the best-known traditional approach for relevance feedback in information retrieval, Ide dec-hi, which modifies queries using term frequencies from relevant and non-relevant documents. We use the CRANFIELD collection on aerodynamics, one of the oldest information retrieval test collections. The results of this evaluation show that, by exploring user intent, we achieve competitive performance in the feedback run compared to Ide dec-hi. At the same time, our user model approach offers the advantage of retrieving more quality documents in the initial run compared to the term frequency-inverse document frequency (TF-IDF) approach. Our results show that our user modeling approach can be used to improve the efficiency, learnability, and interactivity of an information retrieval system.
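For readers unfamiliar with the Ide dec-hi baseline referenced above, the sketch below illustrates the classic query-modification rule: the query vector is expanded with the term vectors of all judged-relevant documents, and the vector of the single highest-ranked non-relevant document is subtracted. This is a minimal illustration of the standard baseline only, not the authors' user-model implementation; the function name and the toy term-frequency vectors are assumptions for demonstration.

```python
from collections import Counter

def ide_dec_hi(query_vec, relevant_docs, top_nonrelevant_doc):
    """Ide dec-hi query modification (illustrative sketch):
    Q' = Q + sum of relevant document vectors - top-ranked non-relevant document vector."""
    new_query = Counter(query_vec)
    for doc in relevant_docs:
        new_query.update(doc)                # add terms from each relevant document
    new_query.subtract(top_nonrelevant_doc)  # subtract the top-ranked non-relevant document
    # Negative weights are typically clipped to zero in practice.
    return {term: w for term, w in new_query.items() if w > 0}

# Toy example with raw term-frequency vectors (hypothetical data).
query = {"aerodynamic": 1, "flow": 1}
relevant = [{"aerodynamic": 2, "boundary": 1, "layer": 1}]
nonrelevant = {"flow": 1, "pipe": 2}
print(ide_dec_hi(query, relevant, nonrelevant))
# {'aerodynamic': 3, 'boundary': 1, 'layer': 1}
```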
