Abstract

Machine reading comprehension (MRC) has long been a significant part of artificial intelligence and a focus of natural language processing (NLP). Given a context paragraph and a question about it, a model must encode the complex interaction between the question and the context. In recent years, with the rapid progress of neural network models and attention mechanisms, MRC has made great advances, and attention mechanisms in particular have been widely adopted. However, the accuracy of earlier baseline models still leaves room for improvement, and some of them do not account for long-range context dependencies or polysemy. In this paper, to address these problems and further improve the model, we introduce ELMo representations and add a gated self-attention layer to the Bi-Directional Attention Flow network (BiDAF). In addition, we employ a feature-reuse method and modify the linear function of the answer layer to further improve performance. In experiments on SQuAD, the model substantially outperforms the baseline BiDAF model and approaches the average level of human performance, which demonstrates its validity.
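The abstract does not give the exact formulation of the gated self-attention layer, so the following is a minimal PyTorch sketch assuming an R-Net-style design, in which each passage position attends over the whole passage and a sigmoid gate filters the fused representation before it flows forward. All class names, shapes, and parameters here are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedSelfAttention(nn.Module):
    """Hypothetical sketch of a gated self-attention layer (R-Net style).

    Each token attends over the entire passage, which helps capture
    long-range dependencies; a sigmoid gate then controls how much of
    the attended context is kept.
    """

    def __init__(self, hidden_size):
        super().__init__()
        self.proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.gate = nn.Linear(2 * hidden_size, 2 * hidden_size, bias=False)

    def forward(self, x, mask=None):
        # x: (batch, seq_len, hidden) -- passage encoding, e.g. from BiDAF
        scores = torch.bmm(self.proj(x), x.transpose(1, 2))   # (B, T, T)
        if mask is not None:
            # mask: (batch, seq_len) boolean, True at valid positions
            scores = scores.masked_fill(~mask.unsqueeze(1), float("-inf"))
        attn = F.softmax(scores, dim=-1)
        context = torch.bmm(attn, x)                          # (B, T, H)
        fused = torch.cat([x, context], dim=-1)               # (B, T, 2H)
        g = torch.sigmoid(self.gate(fused))                   # gate values
        return g * fused                                      # gated output

# Usage with made-up sizes:
layer = GatedSelfAttention(hidden_size=128)
x = torch.randn(2, 50, 128)   # 2 passages, 50 tokens each
out = layer(x)                # (2, 50, 256)
```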
