Abstract

Text-based logical reasoning requires a model to understand the semantics of the input text and then the complex logical relationships within it. Previous works equip pre-trained language models with logical reasoning ability by training them on datasets obtained through logic-driven text extension. However, these methods generate instances only from the logical expressions entailed within the input text. We argue that external commonsense knowledge is still necessary to restore the complete reasoning chains and to generate more reasonable and abundant instances. To address this issue, we propose CSKE, a commonsense-knowledge-enhanced text extension framework. CSKE incorporates abundant commonsense from an external knowledge base to restore potentially missing logical expressions, encodes more logical relationships, and then extends them through logical equivalence laws. Experiments on benchmark datasets show that our method improves logical reasoning performance, especially on instances containing complex logical relationships.

Keywords: Logical expression, Logical reasoning, Commonsense knowledge
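
To make the "extension through logical equivalence laws" step concrete, the following Python sketch applies two such laws (contraposition and transitivity) to implications extracted from text, after restoring a missing link from a toy commonsense lookup. This is an illustration under assumed names (Implication, COMMONSENSE_KB, restore_and_extend), not the authors' CSKE implementation; the knowledge-base entries are made up stand-ins for facts one might retrieve from a resource such as ConceptNet.

    # Illustrative sketch only: a toy version of logic-driven text extension
    # with a commonsense lookup; not the CSKE implementation.
    from dataclasses import dataclass
    from itertools import product


    @dataclass(frozen=True)
    class Implication:
        """A propositional implication 'antecedent -> consequent'."""
        antecedent: str
        consequent: str

        def negate(self, term: str) -> str:
            # Toggle a leading "not " to negate a proposition.
            return term[4:] if term.startswith("not ") else f"not {term}"

        def contrapositive(self) -> "Implication":
            # Logical equivalence law: (A -> B) == (not B -> not A).
            return Implication(self.negate(self.consequent), self.negate(self.antecedent))


    # Toy stand-in for an external commonsense knowledge base (hypothetical entry).
    COMMONSENSE_KB = [
        Implication("it rains", "the ground is wet"),
    ]


    def restore_and_extend(implications: list) -> set:
        """Restore potentially missing links from the KB, then extend the set
        with contraposition and transitivity (A -> B, B -> C  =>  A -> C)."""
        known = set(implications) | set(COMMONSENSE_KB)
        extended = set(known)
        for imp in known:
            extended.add(imp.contrapositive())
        for a, b in product(known, known):
            if a.consequent == b.antecedent:
                extended.add(Implication(a.antecedent, b.consequent))
        return extended


    if __name__ == "__main__":
        # Expression entailed by the text: "the ground is wet -> the match is cancelled".
        text_logic = [Implication("the ground is wet", "the match is cancelled")]
        for imp in sorted(restore_and_extend(text_logic), key=str):
            print(f"{imp.antecedent} -> {imp.consequent}")

In this toy example, the commonsense link "it rains -> the ground is wet" completes the reasoning chain, so transitivity can derive the new instance "it rains -> the match is cancelled", which is not entailed by the input expression alone.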
