Abstract

Knowledge base completion has been an active research topic for knowledge graphs. However, existing methods have limited learning and generalization abilities and neglect the rich internal logic between entities and relations. To address these problems, this paper proposes modeling complex internal logic for knowledge base completion. The method first integrates semantic information into the knowledge representation model and uses the semantic gap to strengthen the credibility scores of positive and negative triples, which not only makes the model converge faster but also yields knowledge representations that fuse semantic information. We then introduce the concept of a knowledge subgraph and, through a memory network with a multi-hop attention mechanism, merge the knowledge in the subgraph with the triple to be completed. The training procedure differs from that of the classical memory network in that it incorporates reinforcement learning: the reciprocal of the correct reasoning knowledge information in the model output is used as the reward value, and the trained model then completes the triple. The method thereby exploits the computational efficiency of knowledge representation together with the strong learning and generalization abilities of the memory network and the multi-hop attention mechanism. Experimental results on the FB15k and WN18 data sets show that the proposed method performs well on knowledge base completion and effectively improves Hits@10 and MRR. We also verified the practicality of the proposed method in a recommendation system and a question answering system based on a knowledge base, and achieved good results.
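To make the memory-network component more concrete, the sketch below illustrates one plausible form of multi-hop attention over a knowledge-subgraph memory: the embedding of the incomplete triple repeatedly attends to embeddings of subgraph triples and is updated at each hop, and the resulting vector is matched against a candidate entity. This is a minimal illustration under assumed names and shapes (multi_hop_score, memory_keys, memory_values, hops, and the toy dimensions are all hypothetical), not the authors' actual model.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def multi_hop_score(query, memory_keys, memory_values, candidate, hops=3):
    """Score a candidate completion against a knowledge-subgraph memory.

    query         : embedding of the incomplete triple (e.g. head + relation), shape (d,)
    memory_keys   : subgraph triple embeddings used for addressing, shape (n, d)
    memory_values : subgraph triple embeddings used for reading, shape (n, d)
    candidate     : embedding of the candidate tail entity, shape (d,)
    """
    u = query
    for _ in range(hops):
        # attention weights over the knowledge-subgraph memory
        attn = softmax(memory_keys @ u)
        # read the memory and update the query state (one "hop")
        u = u + memory_values.T @ attn
    # higher score = more plausible completion of the triple
    return float(u @ candidate)

# toy usage with random embeddings (dimensions chosen arbitrarily)
rng = np.random.default_rng(0)
d, n = 16, 8
query = rng.normal(size=d)
memory_keys = rng.normal(size=(n, d))
memory_values = rng.normal(size=(n, d))
candidate = rng.normal(size=d)
print(multi_hop_score(query, memory_keys, memory_values, candidate))
```

In practice the candidate with the highest score would be ranked against all other entities, which is what metrics such as Hits@10 and MRR measure.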
