Abstract

Knowledge graphs (KGs) play an important role in many artificial intelligence applications. Representation learning of KGs aims to project both entities and relations into a continuous low-dimensional vector space. Embedding-based representation learning has been used for KG completion, which aims to predict missing triples (head, relation, tail) in a KG. Most current methods learn representations from triple information alone, ignoring the textual knowledge and network topology of the KG, which leads to ambiguous completions. To address this problem and achieve more accurate KG completion, we propose a new representation learning model, the TDN model, which jointly embeds the information of triples, text descriptions, and the network structure of the KG in a low-dimensional vector space. We define the framework of TDN and explore the methodology for implementing TDN embedding. To verify the effectiveness of the proposed model, we evaluate TDN via link-prediction experiments on real-world datasets. The experimental results confirm the above claims and show that TDN-based embedding significantly outperforms the baselines.
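The abstract does not specify TDN's scoring function, so as a minimal illustration of the general triple-based embedding idea it builds on, the sketch below uses a TransE-style translational score (an assumption, not TDN's actual model): a triple (head, relation, tail) is considered plausible when the head embedding, translated by the relation embedding, lands near the tail embedding.

```python
# Hedged sketch of triple-based KG embedding scoring.
# TransE-style L1 score is used here for illustration only; TDN additionally
# integrates text descriptions and network structure, which this sketch omits.

def triple_score(head, rel, tail):
    """L1 distance ||h + r - t||; a lower score means a more plausible triple."""
    return sum(abs(h + r - t) for h, r, t in zip(head, rel, tail))

# Toy 3-dimensional embeddings (hypothetical values, for illustration only).
h = [0.1, 0.3, -0.2]   # head entity
r = [0.2, -0.1, 0.4]   # relation
t = [0.3, 0.2, 0.2]    # tail entity

# h + r is close to t, so the score is near zero: a plausible triple.
print(triple_score(h, r, t))
```

In link prediction, such a score is computed for every candidate tail (or head) entity and the candidates are ranked; the standard metrics reported for this task (Mean Rank, Hits@10) follow from that ranking.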
