Abstract

Knowledge Graph Reasoning (KGR) infers new knowledge from existing knowledge and is an effective way to alleviate the incompleteness and sparsity problems of knowledge graphs. Graph Neural Network (GNN)-based approaches achieve strong performance, but they still suffer from problems such as capturing insufficient graph features, introducing noise, ignoring path connectivity, and acquiring incomplete neighborhood information. This paper proposes a neighborhood-aware (NA) graph self-attention-based pre-training model for KGR, namely NA-KGR. The proposed model consists of two phases. The first phase is an enhanced graph attention network, which encodes each entity using the weighted features of the neighbors most likely to have a positive effect on reasoning. The second phase is a neighborhood-aware self-attention mechanism, which adds an adaptive entity similarity matrix to the attention-score computation so that the model can better draw inference-relevant information from neighboring entities. Moreover, we propose a pre-training scheme based on neighborhood-aware random walk sampling and general subgraph structure sampling to improve NA-KGR's generalization ability. Extensive comparison and ablation experiments on various benchmarks demonstrate that the proposed NA-KGR model achieves state-of-the-art results among current GNN-based methods.
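
To make the second phase concrete, the following is a minimal, hypothetical sketch of the idea the abstract describes: a learnable entity-similarity term is added to standard scaled dot-product self-attention scores before the softmax, biasing attention toward informative neighbor entities. All class names, shapes, and the bilinear similarity form are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of neighborhood-aware self-attention: attention logits are
# augmented with an adaptive entity-similarity matrix before the softmax.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighborhoodAwareSelfAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # Learnable bilinear form producing the adaptive entity-similarity matrix
        # (an assumed parameterization; the paper may define similarity differently).
        self.sim_weight = nn.Parameter(torch.randn(dim, dim) * 0.01)
        self.scale = dim ** -0.5

    def forward(self, entity_emb: torch.Tensor) -> torch.Tensor:
        # entity_emb: (num_entities, dim) embeddings of an entity and its neighbors.
        q = self.q_proj(entity_emb)
        k = self.k_proj(entity_emb)
        v = self.v_proj(entity_emb)
        scores = (q @ k.T) * self.scale                      # standard attention logits
        sim = entity_emb @ self.sim_weight @ entity_emb.T    # adaptive similarity matrix
        attn = F.softmax(scores + sim, dim=-1)               # similarity-biased attention
        return attn @ v                                       # neighborhood-aware representations


# Usage: aggregate a 5-entity neighborhood of 64-dimensional embeddings.
layer = NeighborhoodAwareSelfAttention(dim=64)
out = layer(torch.randn(5, 64))
print(out.shape)  # torch.Size([5, 64])
```

Adding the similarity matrix to the logits (rather than replacing them) keeps the usual content-based attention while letting the model learn an extra bias toward neighbors judged relevant for inference.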
