Abstract

KG-augmented models endow existing models with external knowledge graphs and achieve promising performance on various knowledge-intensive tasks, such as commonsense reasoning. Existing methods typically first use heuristics to retrieve knowledge subgraphs relevant to the input, and then employ an effective encoder, such as a GNN, to inject the symbolic knowledge into the neural reasoning network. However, they seldom consider whether the retrieved knowledge subgraphs are actually relevant or useful for reasoning. In fact, our observations and analysis show that most retrieved knowledge is noisy and useless to the reasoning model, which hurts the final performance. To remedy this, this paper proposes information bottleneck based knowledge selection (IBKS), which selects useful knowledge from the retrieved knowledge subgraph; the selected knowledge is expected to better improve the model's commonsense reasoning ability. Moreover, IBKS is model-agnostic and can be plugged into any existing KG-augmented model. Extensive experimental results show that IBKS effectively improves commonsense reasoning performance.
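Although the abstract does not spell out the objective, an information bottleneck formulation of knowledge selection typically learns a compressed representation $Z$ of the retrieved subgraph $G$ that keeps only what is predictive of the target $Y$. A generic form of such an objective (our illustrative sketch, not necessarily the exact IBKS loss) is

$$\max_{Z} \; I(Z; Y) \;-\; \beta \, I(Z; G),$$

where $I(\cdot\,;\cdot)$ denotes mutual information and $\beta > 0$ trades off predictiveness against compression, so that noisy or irrelevant parts of $G$ are discarded.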
