Abstract

Knowledge bases have become essential resources for many data mining and information retrieval tasks, but they remain far from complete. Knowledge base completion, which aims to infer missing facts from the existing ones in a knowledge base, has attracted extensive research effort from researchers and practitioners in diverse areas. Numerous knowledge base completion methods have been developed by regarding each relation as a translation from head entity to tail entity. However, existing methods concentrate merely on fact triples in the knowledge base or on the co-occurrence of words in text, while the supplementary semantic information expressed via related entities in text has not been fully exploited. Meanwhile, the representation ability of current methods encounters bottlenecks due to the structural sparsity of the knowledge base. In this paper, we propose a novel knowledge base representation learning method that takes advantage of the rich semantic information expressed via related entities in a textual corpus to expand the semantic structure of the knowledge base. In this way, our model breaks through the limitation of structural sparsity and improves the performance of knowledge base completion. Extensive experiments on two real-world datasets show that the proposed method successfully addresses the above issues and significantly outperforms state-of-the-art methods on the benchmark task of link prediction.
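The "relation as a translation" idea mentioned above can be illustrated with a minimal sketch of a translation-based scoring function in the style of TransE (Bordes et al., 2013). The embeddings below are toy values chosen for illustration, not parameters from the paper's model:

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """Plausibility score for a triple (head, relation, tail):
    a smaller distance ||h + r - t|| means the fact is more
    likely to hold, since the relation vector r is interpreted
    as a translation from head embedding h to tail embedding t."""
    return np.linalg.norm(h + r - t, ord=norm)

# Toy 3-dimensional embeddings (hypothetical values):
head = np.array([0.1, 0.2, 0.3])
relation = np.array([0.4, 0.1, -0.1])
tail = np.array([0.5, 0.3, 0.2])

true_score = transe_score(head, relation, tail)      # near 0: plausible
corrupt = np.array([-0.8, 0.9, 0.5])                 # a random "wrong" tail
false_score = transe_score(head, relation, corrupt)  # larger: implausible
```

During training, such models minimize the score of observed triples while pushing up the score of corrupted triples, so a well-placed triple scores close to zero and a corrupted one scores noticeably higher.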
