Abstract

The task of causal question answering aims to reason about causes and effects over a provided real or hypothetical premise. Recent approaches have converged on using transformer-based language models to solve question answering tasks. However, pretrained language models often struggle when external knowledge is not present in the premise or when additional context is required to answer the question. To the best of our knowledge, no prior work has explored the efficacy of augmenting pretrained language models with external causal knowledge for multiple-choice causal question answering. In this paper, we present novel strategies for the representation of causal knowledge. Our empirical results demonstrate the efficacy of augmenting pretrained models with external causal knowledge. We show improved performance on the COPA (Choice of Plausible Alternatives) and WIQA (What If Reasoning Over Procedural Text) benchmark tasks. On the WIQA benchmark, our approach is competitive with the state-of-the-art and exceeds it within the evaluation subcategories of In-Paragraph and Out-of-Paragraph perturbations.

Highlights

  • The term causal knowledge has a long history rooted in philosophy, psychology, and many other academic disciplines (Goldman, 1967)

  • Recent model-based approaches for question answering tasks have primarily focused on finetuning pretrained transformer-based language models, such as BERT (Devlin et al.) and RoBERTa

  • Our experiments demonstrate that augmenting pretrained models with external causal knowledge improves results over the baseline on the COPA and WIQA benchmark tasks

Summary

Introduction

Recent model-based approaches for question answering tasks have primarily focused on finetuning pretrained transformer-based language models. However, for tasks such as causal reasoning, pretrained language models are often limited, as they lack the specific external background knowledge required to effectively reason about causality. The term causal knowledge has a long history rooted in philosophy, psychology, and many other academic disciplines (Goldman, 1967); for example, reasoning about activities such as coal-burning power plants would yield the causal fact that factories cause global warming. These causal facts can be described explicitly in a knowledge base or expressed formally as triples. We present a novel end-to-end neural architecture that augments RoBERTa with external causal knowledge for multiple-choice question answering. Given a causal fact such as (ash clouds, cause-effect, environmental disturbances), the model could make the causal association and logical leap that the magnitude of the effect is more.
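The section outline below lists Input Augmentation among the paper's strategies. As a rough, hypothetical illustration of that general idea only (not the paper's actual architecture), the sketch below shows one way a (cause, relation, effect) triple could be verbalized and appended to a COPA-style premise before scoring each alternative with an off-the-shelf RoBERTa multiple-choice head from the Hugging Face transformers library. The premise, choices, and triple here are invented for illustration.

```python
import torch
from transformers import RobertaTokenizer, RobertaForMultipleChoice

# Off-the-shelf RoBERTa with a multiple-choice classification head.
tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMultipleChoice.from_pretrained("roberta-base")
model.eval()

# COPA-style example (invented for illustration).
premise = "The man turned on the faucet."
question = "What happened as a result?"
choices = ["The toilet filled with water.", "Water flowed from the spout."]

# Assumed external causal fact as a (cause, relation, effect) triple,
# verbalized and appended to the premise: the input-augmentation idea.
triple = ("turning on a faucet", "causes", "water to flow")
augmented_premise = f"{premise} {question} {' '.join(triple)}."

# Encode the augmented premise paired with every candidate alternative.
encoding = tokenizer([augmented_premise] * len(choices), choices,
                     return_tensors="pt", padding=True, truncation=True)
# The multiple-choice head expects tensors of shape (batch, num_choices, seq_len).
batch = {k: v.unsqueeze(0) for k, v in encoding.items()}

with torch.no_grad():
    logits = model(**batch).logits          # shape: (1, num_choices)

print("Predicted alternative:", choices[logits.argmax(dim=-1).item()])
```

In practice the multiple-choice head would first be finetuned on COPA or WIQA before its predictions are meaningful; the snippet only illustrates how verbalized causal knowledge can be spliced into the encoder input without changing the model architecture.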

Related Work
Causal Knowledge Representation
Distributed Causal Embeddings
Methodology
CausalKGE
Experimental Settings
CausalSkipgram
Input Augmentation
Results
Broader Impact
A Appendix
