Abstract

Pretrained language models (PLMs) combined with additional features have achieved excellent performance in rumor detection. However, on the one hand, recent studies find that one critical challenge is the significant gap between the objective forms of pretraining and fine-tuning, which prevents taking full advantage of the knowledge in PLMs. On the other hand, text contents are condensed and full of knowledge entities, but existing methods usually focus on textual contents and social contexts while ignoring external knowledge about the entities in the text. In this paper, to address these limitations, we propose a Prompt-based External Knowledge Integration Network (PEKIN) for rumor detection, which incorporates both prior knowledge of the rumor detection task and external knowledge of text entities. For one thing, unlike the conventional "pretrain, fine-tune" paradigm, we propose a prompt-based method that brings in prior knowledge to help PLMs understand the rumor detection task and better stimulate the rich knowledge distributed in PLMs. For another, we identify the entities mentioned in the text and retrieve their annotations from a knowledge base; we then use these annotation contexts as external knowledge to provide complementary information. Experiments on three datasets show that PEKIN outperforms all compared models, significantly beating the previous state of the art on the Weibo dataset.

Keywords: Rumor detection · Prompt-based · External knowledge
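To make the two components described above concrete, the following minimal Python sketch pairs a cloze-style prompt (turning classification into masked-token prediction, so the objective form matches the PLM's masked-language-modeling pretraining) with a toy entity-annotation lookup. The template, the verbalizer words "true"/"false", the bert-base-uncased backbone, and the KNOWLEDGE_BASE dictionary are all illustrative assumptions; PEKIN's actual prompt design and knowledge source are specified in the full paper, not in this abstract.

```python
# Minimal sketch: prompt-based classification with external entity knowledge.
# All names below (template, verbalizer, KNOWLEDGE_BASE) are illustrative
# assumptions, not the paper's actual design.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Hypothetical knowledge base: entity -> short annotation that supplies
# complementary background information about entities in the post.
KNOWLEDGE_BASE = {
    "NASA": "NASA is the United States government agency for space research.",
}

def lookup_entity_annotations(text: str) -> str:
    """Naive substring lookup; a real system would use an entity linker."""
    return " ".join(ann for ent, ann in KNOWLEDGE_BASE.items() if ent in text)

def classify(post: str) -> str:
    # The prompt template reframes rumor detection as filling in a [MASK],
    # aligning the task with the MLM pretraining objective.
    knowledge = lookup_entity_annotations(post)
    prompt = f"{post} {knowledge} This claim is {tokenizer.mask_token}."
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Locate the [MASK] position and score the verbalizer words there.
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
    verbalizer = {"false": "rumor", "true": "non-rumor"}  # assumed label words
    scores = {
        label: logits[0, mask_pos, tokenizer.convert_tokens_to_ids(word)].item()
        for word, label in verbalizer.items()
    }
    return max(scores, key=scores.get)

print(classify("NASA confirmed the moon will turn green tomorrow."))
```

In this framing, no new classification head is trained from scratch: the PLM's existing vocabulary distribution at the mask position does the labeling, which is how prompt-based methods narrow the pretraining/fine-tuning gap the abstract refers to.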
