Abstract

Machine reading comprehension (MRC) is a crucial and challenging task in natural language processing (NLP). With the development of deep learning, language models have achieved excellent results on MRC benchmarks, yet they still struggle to answer complex questions. Researchers currently exploit structured knowledge, such as knowledge bases (KBs), as external knowledge, directly extracting triples to improve machine reading. Although triples supply some background knowledge, they capture only the interrelationships among entities or words. Unstructured knowledge resources such as Wikipedia are far richer and more extensive, but existing methods ignore them, and the effect of combining the two types of knowledge remains unknown. In this study, we make a first attempt to explore the usefulness of combining them. We introduce a fusion mechanism into a rich knowledge fusion (RKF) layer to obtain more useful and relevant knowledge from different external knowledge resources, and we add a bi-matching layer to promote interaction among the different types of knowledge. Building on BERT, we propose the RKF-NET framework; experimental results on two classic datasets, SQuAD1.1 and the Easy-Challenge (ARC), demonstrate its effectiveness.
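As a rough illustration of the kind of operation such a fusion layer might perform, the sketch below gates, per token, between a structured (KB-triple) representation and an unstructured (Wikipedia-sentence) representation before combining them with the BERT encoding. The module name, gating scheme, and dimensions are assumptions for illustration only, not the paper's actual RKF design.

```python
import torch
import torch.nn as nn


class KnowledgeFusion(nn.Module):
    """Hypothetical gated fusion of structured (KB-triple) and
    unstructured (Wikipedia) knowledge vectors with BERT token
    representations. Illustrative only; not the paper's RKF layer."""

    def __init__(self, hidden: int):
        super().__init__()
        # Gate computed from the concatenation of all three sources.
        self.gate = nn.Linear(3 * hidden, hidden)

    def forward(self, h_bert, h_kb, h_wiki):
        # All inputs: (batch, seq_len, hidden), aligned per token.
        g = torch.sigmoid(self.gate(torch.cat([h_bert, h_kb, h_wiki], dim=-1)))
        # Per-token convex combination of the two knowledge sources,
        # added residually to the BERT representation.
        return h_bert + g * h_kb + (1 - g) * h_wiki


# Usage with toy shapes (batch=2, seq_len=16, hidden=768):
fusion = KnowledgeFusion(hidden=768)
out = fusion(torch.randn(2, 16, 768),
             torch.randn(2, 16, 768),
             torch.randn(2, 16, 768))
print(out.shape)  # torch.Size([2, 16, 768])
```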
