Abstract

Natural Language Processing (NLP) has witnessed considerable advances in textual understanding, first through statistical and rule-based techniques and more recently through neural networks and deep learning. This paper surveys research on Reading Comprehension and Question Answering (QA). The initial focus is on Attention and Transformer models: a brief description of these architectures is presented, highlighting the essence of the 'Attention Is All You Need' paper, whose authors elucidate a significant departure from the recurrence concept of Recurrent Neural Networks (RNNs); this Transformer architecture later underpins Bidirectional Encoder Representations from Transformers (BERT). Subsequently, trends in Open Domain Question Answering (ODQA), which mark the progression from passage-based question answering, are presented. Of particular interest is Haystack, an end-to-end open-source framework for Question Answering and neural search; this field appears a promising avenue toward a more intelligent form of 'search'. In a nutshell, the paper weaves through RNNs, Long Short-Term Memory (LSTM) networks, and the currently trending Attention-based Transformer models in NLP. Finally, we dwell on more contemporary research such as ODQA, Multi-Hop QA, and evaluation using Adversarial Networks.
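
For concreteness, the departure from recurrence referred to above is the scaled dot-product attention of 'Attention Is All You Need'; the following is the standard formulation recalled for reference rather than a result of this survey, with Q, K and V denoting the query, key and value matrices and d_k the key dimension:

\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V

Because every query attends to every key in a single matrix operation, tokens interact without the step-by-step hidden-state propagation that characterises RNNs and LSTMs.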

