Abstract

Question answering (QA) and text generation with the BERT and GPT-2 transformers form a specialized area of information retrieval, in which a query is posed to the system and the system locates the correct or closest answer to the natural-language question. The main aim of a QA system is to return a short answer to a question rather than a list of possibly relevant documents; text generation is a language-generation task that focuses on producing understandable English text, predicting the next sentence or generating new text conditioned on the preceding words. The motivation for this work is to find highly relevant answers, to answer general-knowledge questions, to handle questions of the form Who? What? Where? How?, and to return answers in their shortest form. The scope of the work is to support automated moderation on websites such as Stack Overflow, Reddit, and Quora by providing exact, short answers drawn from those sites, self-answering, and text retrieval. The method we use for the QA and text-generation system is a transformer architecture consisting of an encoder and a decoder, in which the encoder stack is represented by the BERT model and the decoder by the GPT-2 model.

Keywords: Text generation; BERT model; GPT-2 model; Question answering; Transformer; Natural language processing; Encoder; Decoder
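To illustrate the "short answer rather than a document list" goal, the following is a minimal, library-free sketch of the span-selection step that a BERT-style extractive QA head performs at inference time: given per-token start and end logits (here hard-coded toy values, not real model outputs), it picks the token span with the highest combined score. The function name, logits, and passage are all illustrative assumptions, not part of the paper's implementation.

```python
import math

def pick_answer_span(start_logits, end_logits, max_len=8):
    """Return the (start, end) token indices maximizing start_logit + end_logit,
    subject to start <= end and a maximum span length, as a BERT-style
    extractive QA head does when decoding a short answer."""
    best, best_score = (0, 0), -math.inf
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best, best_score = (s, e), score
    return best

# Toy passage and hand-picked logits (assumed values for illustration only).
tokens = ["The", "Eiffel", "Tower", "is", "in", "Paris"]
start_logits = [0.1, 0.2, 0.1, 0.0, 0.3, 2.5]
end_logits = [0.0, 0.1, 0.2, 0.1, 0.2, 3.0]

s, e = pick_answer_span(start_logits, end_logits)
print(" ".join(tokens[s:e + 1]))  # prints the short-answer span: Paris
```

In a full system the logits would come from the encoder (BERT) applied to the question-passage pair, and the decoder (GPT-2) would then generate fluent text from the selected evidence.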
