Abstract

Question routing (QR) aims to route newly posted questions to the experts most likely to answer them. Many previous works formalize question routing as a text matching and ranking problem between questions and user profiles, focusing on text representation and semantic similarity computation. However, these works often fail to extract matching features efficiently and lack deep contextual understanding of the text. Moreover, we argue that, beyond the semantic similarity between terms, the interactive relationship between question sequences and user-profile sequences also plays an important role in matching. In this paper, we propose two BERT-based models, QR-BERTrep and QR-tBERTint, to address these issues from different perspectives. QR-BERTrep is a representation-based feature-ensemble model that integrates a weighted sum of BERT layer outputs as an additional feature into a Siamese deep matching network, addressing the problems of non-context-aware word embeddings and limited semantic understanding. QR-tBERTint is an interaction-based model that explores the interactive relationships between sequences, as well as the semantic similarity of terms, through a topic-enhanced BERT model; specifically, it fuses a short-text-friendly topic model to capture corpus-level semantic information. Experimental results on real-world data demonstrate that QR-BERTrep significantly outperforms traditional representation-based models. Meanwhile, QR-tBERTint exceeds QR-BERTrep and QR-BERTint by up to 17.26% and 11.52% in MAP, respectively, showing that combining global topic information with the interactive relationships between sequences is highly effective for question routing.
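The weighted sum of BERT layer outputs mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; it simply assumes softmax-normalized scalar weights over per-layer hidden states (all names and shapes here are illustrative):

```python
import numpy as np

def weighted_layer_sum(layer_outputs, layer_logits):
    """Fuse per-layer hidden states with softmax-normalized scalar weights.

    layer_outputs: array of shape (num_layers, seq_len, hidden_dim),
                   e.g. the stacked hidden states from each BERT layer
    layer_logits:  learnable logits of shape (num_layers,)
    returns:       fused representation of shape (seq_len, hidden_dim)
    """
    # Numerically stable softmax over the layer dimension
    w = np.exp(layer_logits - layer_logits.max())
    w = w / w.sum()
    # Weighted combination across layers: (L,) x (L, S, H) -> (S, H)
    return np.tensordot(w, layer_outputs, axes=1)

# Toy demo: 4 "layers", a 6-token sequence, hidden size 8
rng = np.random.default_rng(0)
layers = rng.normal(size=(4, 6, 8))
fused = weighted_layer_sum(layers, np.zeros(4))  # equal logits -> plain mean
```

In a trained model the logits would be learned parameters, so the network can decide which layers contribute most to the matching feature.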
