Abstract

Interpretable multi-hop question answering requires step-by-step reasoning over multiple documents, gathering scattered supporting facts to answer the question. Prior work has proposed entity-graph methods that aggregate entity information to improve reasoning ability. However, the entity graph discards non-entity information that is also important for understanding the semantics. Moreover, entities scattered across noisy sentences may mislead the reasoning process. In this paper, we propose the Coarse and Fine Granularity Graph Network (CFGGN), a novel interpretable model that combines sentence information and entity information to answer multi-hop questions. The CFGGN consists of a coarse-grain module that performs sentence-level reasoning and a fine-grain module that performs entity-level inference. In sentence-level reasoning, a sentence graph is constructed to filter out noisy sentences and capture sentence features. In entity-level inference, a dynamic entity graph is used for entity-level reasoning. We design a fusion module to integrate information of different granularities. To enhance the interpretability of the overall process, we calculate a reasoning score for each step and present the reasoning path from the multiple documents to the final answer. Evaluation on the HotpotQA dataset in the distractor setting shows that our method outperforms the published state-of-the-art entity-based method on five out of six metrics.
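The coarse-to-fine pipeline described above can be illustrated with a minimal toy sketch. This is not the paper's implementation: the actual CFGGN uses learned GNN layers and a learned fusion module, whereas the helper functions below (`coarse_filter`, `entity_message_passing`, `fuse`) are hypothetical stand-ins using fixed vector arithmetic, included only to make the three stages concrete.

```python
# Toy sketch of coarse-to-fine graph reasoning (illustrative only;
# the real CFGGN learns these operations end to end).

def dot(u, v):
    # Plain dot product between two equal-length vectors.
    return sum(a * b for a, b in zip(u, v))

def coarse_filter(question_vec, sentence_vecs, threshold=0.5):
    """Coarse (sentence-level) step: keep only sentences whose
    similarity to the question exceeds a threshold, filtering noise."""
    return [i for i, s in enumerate(sentence_vecs)
            if dot(question_vec, s) > threshold]

def entity_message_passing(entity_vecs, adj):
    """Fine (entity-level) step: one round of mean aggregation over
    the entity graph, averaging each node with its neighbors' mean."""
    updated = []
    for i, v in enumerate(entity_vecs):
        msgs = [entity_vecs[j] for j in adj.get(i, [])]
        if msgs:
            mean = [sum(col) / len(msgs) for col in zip(*msgs)]
            updated.append([(a + b) / 2 for a, b in zip(v, mean)])
        else:
            updated.append(v)  # isolated node keeps its representation
    return updated

def fuse(sent_vec, ent_vec):
    """Fusion step: combine coarse and fine representations.
    A simple average here; the paper uses a learned fusion module."""
    return [(a + b) / 2 for a, b in zip(sent_vec, ent_vec)]

if __name__ == "__main__":
    question = [1.0, 0.0]
    sentences = [[0.9, 0.1], [0.0, 1.0]]       # sentence 1 is off-topic
    kept = coarse_filter(question, sentences)  # -> [0]

    entities = [[1.0, 0.0], [0.0, 1.0]]        # entities in kept sentences
    adj = {0: [1], 1: [0]}                     # entity co-occurrence edges
    refined = entity_message_passing(entities, adj)

    fused = fuse([1.0, 0.0], refined[0])
    print(kept, refined, fused)
```

Filtering before entity reasoning matters in this sketch for the same reason it matters in the paper: entities from the dropped noisy sentence never enter the graph, so they cannot mislead the message passing.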

Highlights

  • The task of question answering (QA) requires the machine to answer a natural language question by reading the given context

  • We propose the Coarse and Fine Granularity Graph Network (CFGGN) for multi-hop question answering over multiple documents

  • We evaluate our model on HotpotQA [5], a recently released multi-hop reading comprehension dataset

Summary

INTRODUCTION

The task of question answering (QA) requires the machine to answer a natural language question by reading the given context. Multi-document QA is more challenging because of the larger and noisier context. In earlier single-hop datasets, questions can be answered in one hop because the evidence exists in a single sentence. To further evaluate the machine's ability to understand and reason, several multi-hop question answering datasets ([4], [5]) have been proposed. Given the question and 10 paragraphs, machines are expected to return the supporting facts (purple highlighted sentences in Figure 1) and the golden answer (‘‘YG Entertainment’’). To this end, machines need to reason step by step across these paragraphs. Existing research recognizes the critical role played by Graph Neural Networks (GNN) in realizing multi-hop reasoning over multiple documents. Our model also presents an interpretable path from the multiple documents to the final answer.

RELATED WORK
PARAGRAPH FILTER
SENTENCE GRAPH
ENTITY GRAPH
FUSION
MODEL INTERPRETATION
Findings
CONCLUSION