In this paper, we study the problem of predicting the popularity of questions in Community Question Answering (CQA). To address this problem, we propose a Posterior Attention Recurrent Point Process Model (PARPP) that takes both user interactions and the Matthew effect into account for question popularity prediction. PARPP uses a long short-term memory (LSTM) network to encode the observed answering history and another LSTM to track the decoding state at each step. At each decoding step, prior attention is used to identify the answers that have a greater impact on the question. When a new answer is observed, Bayes' rule is applied to revise the prior attention into a posterior attention, which is then used to update the decoding state. We further introduce a convergence strategy to capture the Matthew effect in CQA. We conduct experiments on a dataset crawled from Zhihu, a popular Chinese CQA forum. The experimental results show that our model outperforms several state-of-the-art methods. We further analyze the attention mechanism and find that it better captures the impact of each answer on the future popularity of the question, making our model more interpretable. Our study may also shed light on related tasks such as ranking the answers to a question and finding experts on the question's topics.
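As a rough illustration of the posterior attention step (the notation here is ours, not the paper's): if $\alpha^{\mathrm{prior}}_i$ denotes the prior attention weight on the $i$-th observed answer and $p(e \mid h_i)$ the likelihood of the newly observed answer event $e$ given that answer's encoded state $h_i$, then a Bayes-rule update of the form

$\alpha^{\mathrm{post}}_i = \dfrac{\alpha^{\mathrm{prior}}_i \, p(e \mid h_i)}{\sum_{j} \alpha^{\mathrm{prior}}_j \, p(e \mid h_j)}$

re-weights attention toward the answers that best explain the new event before the decoder state is updated. This is only a sketch of the general posterior-attention idea; the exact parameterization is given in the full paper.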