Abstract

Conversational search is a dominant application of Question Answering, which is achieved through various NLP techniques and Deep Learning models. The advent of Transformer models has been a breakthrough in Natural Language Processing, setting new benchmarks on state-of-the-art NLP tasks such as question answering. Here we propose a semantic Malayalam Question Answering system that automatically answers queries related to health issues. Biomedical Question Answering, especially in the Malayalam language, is a tedious and challenging task. The proposed model uses a neural network-based Bidirectional Encoder Representations from Transformers (BERT) architecture to implement the question answering system. In this study, we investigate how to train and fine-tune a BERT model for Question Answering. The system has been tested on our own annotated Malayalam SQuAD-format health dataset. Comparing the results with our previous work, a word-embedding-based model and an RNN-based model, we find that the BERT model is more accurate and achieves an F1 score of 86%.
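The abstract describes fine-tuning BERT for extractive, SQuAD-style question answering, where the model emits start and end logits over the context tokens and the answer is the highest-scoring span. The helper below is a minimal sketch of that span-decoding step only; the function name `best_span` and the `max_len` cap are illustrative assumptions, not part of the paper's described system.

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=30):
    """Pick the highest-scoring (start, end) answer span from a
    SQuAD-style QA head, assuming one logit per context token.
    Only spans with end >= start and length <= max_len are valid."""
    best = (0, 0)
    best_score = -np.inf
    for s, s_logit in enumerate(start_logits):
        # Consider only well-formed spans up to max_len tokens long.
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best, best_score
```

In a full pipeline the logits would come from a fine-tuned BERT QA head, and the returned token indices would be mapped back to the original Malayalam text via the tokenizer's offset mapping.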
