Abstract

Knowledge graph embedding, which represents the entities and relations of a knowledge graph with high-dimensional vectors, has made significant progress in link prediction. In recent years, researchers have increasingly explored the representational capabilities of these models, i.e., designing models that better capture symmetry/antisymmetry and composition relation patterns. Current embedding models tend to use the identical vector for the same entity in different triples when measuring how well a triple matches. It is well known, however, that judging the plausibility of a specific triple amounts to comparing the matching degree of the specific attributes associated with its relation. Inspired by this observation, this paper designs a Semantic Filter Based on Relations (SFBR) to extract the required attributes of the entities; the plausibility of triples is then compared under these extracted attributes using traditional embedding models. The semantic filter module can be added to most geometric and tensor decomposition models with minimal additional memory. Experiments on the benchmark datasets show that the relation-based semantic filter suppresses the impact of irrelevant attribute dimensions and improves link prediction performance. Tensor decomposition models equipped with SFBR achieve state-of-the-art results.
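The abstract describes SFBR as a relation-based filter applied to entity embeddings before a conventional score function. The exact form of the filter is not given here, so the following is only a minimal sketch assuming a diagonal (element-wise) per-relation filter combined with a TransE-style distance score; all names, shapes, and the diagonal form are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption: diagonal per-relation filter + TransE-style score).
import numpy as np

rng = np.random.default_rng(0)
dim, n_ent, n_rel = 8, 5, 3

ent = rng.normal(size=(n_ent, dim))      # entity embeddings
rel = rng.normal(size=(n_rel, dim))      # relation embeddings
filt_h = rng.normal(size=(n_rel, dim))   # hypothetical per-relation filter for heads
filt_t = rng.normal(size=(n_rel, dim))   # hypothetical per-relation filter for tails

def score_transe_sfbr(h, r, t):
    """Higher is better: negative distance after filtering entity attributes."""
    hf = ent[h] * filt_h[r]              # keep only attributes relevant to relation r
    tf = ent[t] * filt_t[r]
    return -np.linalg.norm(hf + rel[r] - tf, ord=1)

print(score_transe_sfbr(0, 1, 2))
```

The point of the sketch is that the same entity vector is reshaped per relation before scoring, so the base score function itself stays unchanged.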

Highlights

  • Knowledge Graphs (KGs) are collections of large-scale triples, such as Freebase (Bordes et al, 2013), YAGO (Suchanek et al, 2008) and DBpedia (Auer et al, 2019); embedding approaches include RotatE (Sun et al, 2019), a method based on vector space rotation, and HAKE (Zhang et al, 2020a)

  • We evaluate the performance of link prediction in the filtered setting (Bordes et al, 2013), i.e., all known true triples are removed from the candidate set except for the current test triple (a minimal sketch of this evaluation follows this list)

  • Compared with TransE, TransE-SFBR has significant improvements: on WN18RR, Hit@10 increases by 3.8%

  • We select a pair of samples to analyze the function of SFBR and show the additional resources occupied by SFBR
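
The filtered-setting evaluation mentioned in the highlights can be summarized as: score every candidate entity for a test triple, but discard candidates that would form other triples already known to be true, then take the rank of the true entity. Below is a minimal sketch of such a ranking routine; `score_fn`, `known_tails`, and the toy scorer are illustrative names, not the paper's code.

```python
# Minimal sketch of link-prediction evaluation in the filtered setting
# (Bordes et al., 2013). Illustrative only.
import numpy as np

def filtered_rank(score_fn, h, r, t, n_entities, known_tails):
    """Rank of the true tail t among all candidate entities, after filtering."""
    scores = np.array([score_fn(h, r, cand) for cand in range(n_entities)])
    # mask every known true tail except the current test tail
    for cand in known_tails.get((h, r), set()):
        if cand != t:
            scores[cand] = -np.inf
    # rank = 1 + number of candidates scoring strictly higher than t
    return int(1 + np.sum(scores > scores[t]))

# toy usage with a random scorer
rng = np.random.default_rng(0)
toy_score = lambda h, r, t: float(rng.normal())
rank = filtered_rank(toy_score, h=0, r=0, t=3, n_entities=10,
                     known_tails={(0, 0): {1, 3, 7}})
print("rank:", rank, "hit@10:", rank <= 10)
```

Metrics such as Hit@10 and MRR are then averaged over all test triples from the ranks produced this way.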


Summary

A Semantic Filter Based on Relations for Knowledge Graph Completion

Knowledge graph embedding, representing entities and relations in the knowledge graphs with high-dimensional vectors, has made significant progress in link prediction. It is well known that judging the plausibility of a specific triple amounts to comparing the matching degree of the specific attributes associated with its relation; inspired by this fact, this paper designs a Semantic Filter Based on Relations (SFBR) to extract the required attributes of the entities. Knowledge graph embedding models map entities and relations into low-dimensional vectors (or matrices, tensors), measure the plausibility of triples through specific score functions between entities and relations, and rank the triples by these scores. TransD (Ji et al, 2015) tries to incorporate different representations of an entity, determined jointly by the entity and the relation, into the calculation. Such variants attempt to perform complex transformations based on relations or triples to achieve different representations of entities in different semantic spaces. Scholars are more inclined to solve link prediction by designing models with more powerful representation, such as ComplEx (Trouillon et al, 2016), TuckER (Balazevic et al, 2019), and RotatE (Sun et al, 2019).
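As a concrete illustration of the relation-dependent entity representations the summary attributes to TransD, the sketch below reconstructs the published TransD projection (h_perp = (r_p h_p^T + I) h, which reduces to (h_p · h) r_p + h when entity and relation dimensions match) followed by a TransE-style distance. It is a reconstruction under those assumptions, not the authors' code.

```python
# Minimal sketch of a TransD-style relation-dependent projection (Ji et al., 2015).
import numpy as np

def transd_project(e, e_p, r_p):
    """Project entity embedding e into the relation space defined by r_p."""
    return np.dot(e_p, e) * r_p + e   # (r_p e_p^T + I) e with equal dimensions

def transd_score(h, h_p, r, r_p, t, t_p):
    """TransE-style distance in the relation-specific space (lower is better)."""
    h_perp = transd_project(h, h_p, r_p)
    t_perp = transd_project(t, t_p, r_p)
    return np.linalg.norm(h_perp + r - t_perp, ord=2)

rng = np.random.default_rng(0)
dim = 8
h, h_p, r, r_p, t, t_p = (rng.normal(size=dim) for _ in range(6))
print(transd_score(h, h_p, r, r_p, t, t_p))
```

SFBR pursues the same goal (relation-specific views of an entity) but as a lightweight filter module that can be attached to existing geometric and tensor decomposition models.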

Introduction
Background
SFBR model
Special Cases with SFBR
Conclusion
A Analysis of generality for MLP-based SFBR
