Abstract

Understanding user intent is essential for a variety of retrieval tasks. By leveraging contextual information within sessions, e.g., query history and user click behaviors, search systems can capture user intent more accurately and thus perform better. However, most existing systems consider only intra-session contexts and may therefore suffer from a lack of contextual information, because short search sessions account for a large proportion of practical search scenarios. We believe that in these scenarios, incorporating additional contexts, e.g., cross-session dependencies, can alleviate this problem and lead to better performance. We therefore propose a novel Hybrid framework for Session Context Modeling (HSCM), which performs session-level multi-task learning based on the self-attention mechanism. To alleviate the scarcity of contextual information within the current session, HSCM exploits cross-session contexts by sampling user interactions issued under similar search intents in historical sessions and aggregating them into the local context. Moreover, adopting the self-attention mechanism rather than an RNN-based framework for modeling session-level sequences helps (1) better capture interactions within sessions and (2) encode session contexts in parallel. Experimental results on two practical search datasets show that HSCM not only outperforms strong baselines such as HiNT, CARS, and BERTserini in document ranking, but also performs significantly better than most existing query suggestion methods. An additional experiment further shows that HSCM outperforms most ranking models in click prediction.
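
To illustrate the parallel session encoding described above, the sketch below shows one way a self-attention encoder could fold sampled cross-session interactions into the local session sequence. This is a minimal PyTorch sketch under stated assumptions, not HSCM's published architecture: the SessionContextEncoder class, the mean-pooling aggregation of historical interactions, and all layer sizes are illustrative choices made here for demonstration.

    import torch
    import torch.nn as nn

    class SessionContextEncoder(nn.Module):
        """Self-attention encoder over a session's interaction sequence.

        Hypothetical sketch: the mean-pooling aggregation of sampled
        cross-session interactions and all hyperparameters are assumptions,
        not HSCM's published configuration.
        """

        def __init__(self, d_model=128, n_heads=4, n_layers=2):
            super().__init__()
            layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

        def forward(self, intra_session, cross_session):
            # intra_session: (batch, seq_len, d_model) - current-session interactions
            # cross_session: (batch, k, d_model)       - k interactions sampled from
            #                historical sessions with similar search intents
            # Aggregate the sampled historical interactions into one context
            # vector and prepend it to the local sequence (one simple choice).
            global_ctx = cross_session.mean(dim=1, keepdim=True)
            sequence = torch.cat([global_ctx, intra_session], dim=1)
            # Self-attention lets every interaction attend to every other one
            # in parallel, unlike the stepwise recurrence of an RNN encoder.
            return self.encoder(sequence)

    if __name__ == "__main__":
        enc = SessionContextEncoder()
        local = torch.randn(2, 5, 128)  # two sessions, five interactions each
        hist = torch.randn(2, 8, 128)   # eight sampled historical interactions
        print(enc(local, hist).shape)   # torch.Size([2, 6, 128])

The contextualized representations produced this way could then feed the framework's multiple session-level tasks, e.g., document ranking and query suggestion heads trained jointly.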
