With the development of online social media, the volume of news has exploded. While social media provides a platform for the release and dissemination of news, it also allows fake news to proliferate, posing potential social risks. Detecting fake news quickly and accurately is therefore a difficult task, and multimodal fusion models for fake news detection are a current research focus and development trend. However, in terms of content, most existing methods neither mine the background knowledge hidden in news content nor connect that background knowledge to existing knowledge systems. In terms of the propagation chain, existing research tends to consider only the single chain from the preceding propagation node, ignoring the intricate propagation structure and the mutual influence among users. To address these problems, this paper proposes a multimodal fake news detection model, A-KWGCN, based on a knowledge graph and a weighted graph convolutional network (GCN). The model fully extracts features from both the news content and the interactions among users during news dissemination. On the one hand, the model mines relevant knowledge concepts from the news content, links them to knowledge entities in the wiki knowledge graph, and incorporates the knowledge entities and their entity contexts as auxiliary information. On the other hand, inspired by the “similarity effect” in social psychology, the model constructs a user interaction network and defines a weighted GCN whose edge weights are computed from the feature similarity among users, so as to model their mutual influence. Two public datasets, Twitter15 and Twitter16, are used to evaluate the model, on which it achieves accuracies of 0.905 and 0.930, respectively. In comparison experiments, A-KWGCN shows significant advantages over six baseline models on four evaluation metrics. Ablation experiments further verify that both the knowledge module and the weighted GCN module play significant roles in fake news detection.
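To make the weighted-GCN idea concrete, the following is a minimal sketch (not the authors' released code) of one graph convolution layer whose edge weights are cosine similarities between user feature vectors, masked by the observed user-interaction adjacency. The class name, feature dimensions, choice of cosine similarity, and symmetric normalization are all illustrative assumptions, since the abstract does not specify these details.

```python
# Hypothetical sketch of a similarity-weighted GCN layer, as suggested by the
# abstract; all names, dimensions, and normalization choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimilarityWeightedGCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, interact: torch.Tensor) -> torch.Tensor:
        # x: (num_users, in_dim) user feature vectors.
        # interact: (num_users, num_users) binary adjacency of the
        # user interaction network.
        sim = F.cosine_similarity(x.unsqueeze(1), x.unsqueeze(0), dim=-1)
        w = sim.clamp(min=0) * interact          # keep only observed edges
        w = w + torch.eye(x.size(0))             # add self-loops
        deg = w.sum(dim=1)
        d_inv_sqrt = deg.pow(-0.5)
        d_inv_sqrt[torch.isinf(d_inv_sqrt)] = 0.0
        # Symmetric normalization: D^{-1/2} W D^{-1/2}
        norm_w = d_inv_sqrt.unsqueeze(1) * w * d_inv_sqrt.unsqueeze(0)
        return F.relu(self.linear(norm_w @ x))

# Toy usage: 4 users with 8-dim features and a small interaction graph.
x = torch.randn(4, 8)
interact = torch.tensor([[0, 1, 1, 0],
                         [1, 0, 0, 1],
                         [1, 0, 0, 1],
                         [0, 1, 1, 0]], dtype=torch.float)
layer = SimilarityWeightedGCNLayer(8, 16)
out = layer(x, interact)  # (4, 16) updated user representations
```

Weighting the adjacency by feature similarity means messages from like-minded users are amplified during aggregation, which is one plausible reading of the “similarity effect” the abstract invokes.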