Abstract

Knowledge base (KB) completion predicts new facts in a KB by performing inference over the existing facts, which is very important for expanding KBs. Most previous KB completion approaches infer new facts only from the relational facts (facts containing object properties) in KBs. In practice, most KBs also contain a large number of literal facts (facts containing datatype properties) besides the relational ones; these literal facts are ignored by previous approaches. This paper studies how to take the literal facts into account when making inference, aiming to further improve the performance of KB completion. We propose a new approach that consumes both relational and literal facts to predict new facts. Our approach extracts literal features from literal facts and combines them with path-based features extracted from relational facts; a predictive model is then trained on all the features to infer new facts. Experiments on the YAGO KB show that our approach outperforms the compared approaches that only take relational facts as input.
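To make the feature-combination idea concrete, the following is a minimal sketch (not the authors' implementation) of concatenating path-based features from relational facts with literal features from datatype properties and training a predictive model on the combined representation. The feature values, their meanings, and the choice of logistic regression are assumptions made purely for illustration.

```python
# Minimal sketch: combine path-based relational features with literal
# features, then train a single classifier on all features.
# All feature names/values below are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Path-based features for candidate (subject, relation, object) triples,
# e.g. random-walk probabilities over relation paths (hypothetical values).
path_features = np.array([
    [0.8, 0.1, 0.0],   # candidate triple 1
    [0.2, 0.0, 0.4],   # candidate triple 2
    [0.0, 0.6, 0.3],   # candidate triple 3
])

# Literal features derived from datatype properties of the entities,
# e.g. a numeric attribute difference and an indicator feature (hypothetical).
literal_features = np.array([
    [25.0, 1.0],
    [ 3.0, 0.0],
    [40.0, 1.0],
])

# Labels: whether each candidate triple is a known (positive) fact.
labels = np.array([1, 0, 1])

# Concatenate both feature groups into one matrix and train a model on all features.
features = np.hstack([path_features, literal_features])
model = LogisticRegression().fit(features, labels)

# Score an unseen candidate triple using both feature types.
candidate = np.array([[0.5, 0.2, 0.1, 30.0, 1.0]])
print(model.predict_proba(candidate)[:, 1])  # estimated probability the fact holds
```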
