Abstract

Relation prediction for knowledge graphs aims at predicting missing relationships between entities. Inductive relation prediction methods have received increasing attention because of their capability of handling unseen entities. Among the inductive methods, subgraph-based algorithms have emerged, which inductively predict relations using the subgraph surrounding the target entities. Despite their effectiveness, prior subgraph-based studies rarely focus on the explainability of subgraph reasoning, which is critical for humans to understand and trust the predictions from GNNs. One reason for the weak explainability of subgraph-based methods is the presence of noisy nodes and edges, which also limits model performance. In this paper, we present a dynamic graph dropout algorithm that prunes irrelevant nodes and edges to find the minimum sufficient subgraph for relation prediction. In this way, the proposed algorithm provides an explanation of which parts of the subgraph the model derives its results from. Specifically, we design an estimation function to evaluate the importance of nodes and edges. Two dropout functions, i.e., soft and hard dropout, are elaborately designed to filter out noisy nodes and edges based on their importance. Moreover, multiple restriction losses, including a topological loss and a penalty loss, are proposed to regularize the generation of the pruned subgraph. Thus, our model seeks to preserve the topological information of the subgraph while maximally eliminating redundant information. By removing noisy information, the proposed algorithm outperforms state-of-the-art models on eight inductive datasets.
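The two dropout mechanisms described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the importance scores, the sigmoid gating for soft dropout, and the top-k selection for hard dropout are all assumed formulations consistent with the abstract's description.

```python
import numpy as np

def soft_dropout(edge_importance, temperature=1.0):
    # Soft dropout (assumed form): reweight edges by a sigmoid gate of
    # their importance, so pruning stays differentiable during training.
    return 1.0 / (1.0 + np.exp(-edge_importance / temperature))

def hard_dropout(edges, edge_importance, keep_ratio=0.5):
    # Hard dropout (assumed form): keep only the top-k most important
    # edges, discretely removing the rest from the subgraph.
    k = max(1, int(len(edges) * keep_ratio))
    top = np.argsort(edge_importance)[-k:]
    return [edges[i] for i in sorted(top)]

# Toy subgraph around a target entity pair, with hypothetical
# importance scores produced by an estimation function.
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
importance = np.array([2.1, -0.5, 0.3, 1.7, -1.2])

weights = soft_dropout(importance)           # soft gates in (0, 1)
pruned = hard_dropout(edges, importance, keep_ratio=0.4)
print(pruned)  # the two highest-importance edges survive
```

In practice, the soft gates would scale messages in the GNN while gradients flow, and the hard dropout would yield the final pruned subgraph that serves as the explanation.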
