Abstract
Question answering (QA) has become one of the most popular applications of natural language processing (NLP) and information retrieval. For use in QA systems, this paper presents a question-classification technique based on NLP and Bidirectional Encoder Representations from Transformers (BERT). We experimentally investigated BERT for question classification on the TREC-6 dataset and a Thai sentence dataset. We propose an improved processing technique called "More Than Words – BERT" (MTW-BERT), which uses special NLP annotation tags that combine part-of-speech (POS) tagging and named entity recognition (NER), allowing the model to learn both the grammatical tag sequence and the recognized entities together as input before the text is classified by the BERT model. Experimental results show that MTW-BERT outperformed existing classification methods and achieved new state-of-the-art performance on question classification for the TREC-6 dataset, with 99.20% accuracy. In addition, MTW-BERT was applied to wh-question classification for Thai sentences, where the proposed technique achieved an accuracy of 87.50%.

Keywords: Classification; BERT-based model; NLP tagging; Thai sentence analysis
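The core idea of MTW-BERT described above can be illustrated with a minimal sketch: each token of a question is augmented with its POS tag and NER label before the resulting string is handed to a BERT tokenizer and classifier. The tag inventory, bracket format, and example labels below are illustrative assumptions, not the paper's exact annotation scheme.

```python
# Hypothetical sketch of combining POS and NER tags per token before
# BERT classification; the tag format is an assumption for illustration.

def mtw_annotate(tokens, pos_tags, ner_tags):
    """Interleave each token with its POS tag and NER label."""
    return " ".join(
        f"{tok} [{pos}] [{ner}]"
        for tok, pos, ner in zip(tokens, pos_tags, ner_tags)
    )

# Example question with illustrative Penn Treebank POS and NER labels.
tokens = ["Who", "wrote", "Hamlet", "?"]
pos = ["WP", "VBD", "NNP", "."]
ner = ["O", "O", "WORK_OF_ART", "O"]

annotated = mtw_annotate(tokens, pos, ner)
print(annotated)
# The annotated string would then be passed through a BERT tokenizer
# and a sequence-classification head to predict the question class.
```

In practice the POS and NER tags would come from an NLP pipeline run over each question, and the annotated strings would replace the raw questions as BERT's classification input.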