Abstract

Graph neural networks (GNNs) have proven highly effective for representation learning on knowledge graphs. Recent methods such as SACN and CompGCN have achieved state-of-the-art results in knowledge graph completion (KGC). However, previous efforts mostly rely on localized first-order approximations of spectral graph convolutions or on first-order neighborhoods, ignoring abundant local structures such as cycles and stars. The diverse semantic information carried by these structures is therefore not well captured, leaving room for better knowledge representations that would ultimately benefit KGC. In this work, we propose LSA-GAT, a graph attention network with a novel neighborhood aggregation strategy for knowledge graph completion. The model takes special local structures into account and derives a sophisticated representation covering both semantic and structural information. Moreover, LSA-GAT is combined with a CNN-based decoder to form an encoder-decoder framework with a carefully designed training process. Experimental results show significant improvements of the proposed LSA-GAT over current state-of-the-art methods on the FB15k-237 and WN18RR datasets.
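The paper's LSA-GAT aggregation over cycles and stars is not spelled out in the abstract, but the attention-based neighborhood aggregation it builds on (the plain GAT scheme) can be illustrated with a minimal sketch. All names here (`gat_aggregate`, the toy adjacency, the weight shapes) are hypothetical, not from the paper:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_aggregate(h, neighbors, W, a):
    """One attention-weighted neighborhood aggregation step (plain GAT style).

    h:         (num_nodes, d_in)  entity embeddings
    neighbors: dict node -> list of neighbor node ids
    W:         (d_in, d_out)      shared linear projection
    a:         (2 * d_out,)       attention parameter vector
    """
    z = h @ W                      # project all entity embeddings
    out = np.zeros_like(z)
    for i, nbrs in neighbors.items():
        cand = [i] + list(nbrs)    # include a self-loop
        # score each (center, neighbor) pair, then normalize with softmax
        scores = np.array(
            [np.tanh(np.concatenate([z[i], z[j]]) @ a) for j in cand]
        )
        alpha = softmax(scores)
        # aggregate: attention-weighted sum of projected neighbors
        out[i] = sum(w * z[j] for w, j in zip(alpha, cand))
    return out

# Toy 4-node graph
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))
neighbors = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
out = gat_aggregate(h, neighbors, W, a)
print(out.shape)  # (4, 2)
```

In LSA-GAT, the candidate set and scoring would additionally reflect the local structures (cycles, stars) an entity participates in, rather than only its first-order neighbors as here.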

Highlights

  • Knowledge Graphs (KGs), such as Freebase [1], YAGO [2], DBpedia [3] and NELL [4], are fashionable carriers for various common-sense knowledge

  • For each query, the correct entity is ranked among all KG entities, excluding the other true entities for triples observed in the train and valid sets (the filtered setting)

  • We report the Mean Rank (MR), the Mean Reciprocal Rank (MRR), i.e., the average reciprocal rank of the correct entity, and Hits@N, the proportion of correct entities ranked in the top N predictions
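Given the filtered rank of the correct entity for each query, all three metrics above follow directly. A minimal sketch (the helper name and toy ranks are illustrative, not from the paper):

```python
import numpy as np

def ranking_metrics(ranks, hits_at=(1, 3, 10)):
    """Compute MR, MRR and Hits@N from 1-based filtered ranks."""
    ranks = np.asarray(ranks, dtype=float)
    metrics = {
        "MR": ranks.mean(),            # mean rank: lower is better
        "MRR": (1.0 / ranks).mean(),   # mean reciprocal rank: higher is better
    }
    for n in hits_at:
        # fraction of queries where the correct entity is in the top N
        metrics[f"Hits@{n}"] = (ranks <= n).mean()
    return metrics

print(ranking_metrics([1, 2, 5, 1, 12]))
# MR = 4.2, MRR ≈ 0.557, Hits@1 = 0.4, Hits@3 = 0.6, Hits@10 = 0.8
```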


Introduction

Knowledge Graphs (KGs), such as Freebase [1], YAGO [2], DBpedia [3] and NELL [4], are fashionable carriers for various common-sense knowledge. They act as the core of many state-of-the-art natural language processing solutions to practical applications, including question answering and reading comprehension.

