Abstract

The way a question is phrased determines the quality and type of response it can elicit; rephrasing a question is likely to change its answer. Well-formed questions therefore facilitate quality communication and information exchange in natural language. Moreover, Automated Question Generation Models (AQGMs) greatly reduce the human errors and grammatical deficiencies that can arise when formulating relevant questions. Existing AQGM methods rely on convoluted architectures that demand substantial computational resources and increase the ambiguity of the task. In this paper we propose an Ensemble Question Generation using Transfer Learning (EQGTL) framework, which applies transfer learning to fine-tune a BERT Word Sense Disambiguation (BERT-WSD) model and a Text-to-Text Transfer Transformer (T5) model. We rely on the robustness of the pre-trained T5 model and fine-tune it to generate questions and correct answers alongside multiple incorrect answers, known as distractors. When tested on unseen free-form text, our model performs well, generating questions and answers that are grammatically correct and contextually relevant. Tutees can use the generated questions to assess their level of comprehension, and instructors can use them to quickly build up key ideas at any time. Our proposed model is a valuable solution that can be applied across multiple domains and subject areas.
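As a rough illustration of the fine-tuning setup summarized above, the sketch below shows how a pre-trained T5 checkpoint might be loaded and prompted for answer-aware question generation using the Hugging Face transformers library. The checkpoint name, task prefix, and decoding settings are illustrative assumptions for a generic T5 question-generation fine-tune, not the authors' EQGTL implementation.

```python
# Minimal sketch (not the authors' code): loading a T5 checkpoint and
# prompting it for answer-aware question generation with Hugging Face.
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Placeholder checkpoint; in the paper, T5 is fine-tuned on data pairing
# passages with questions, correct answers, and distractors.
model_name = "t5-base"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

context = "The mitochondrion is the organelle that produces most of the cell's ATP."
answer = "mitochondrion"

# T5 is trained in a text-to-text format with task prefixes; a question-
# generation fine-tune typically conditions on the answer span and its context.
prompt = f"generate question: answer: {answer} context: {context}"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With a checkpoint actually fine-tuned for question generation, the decoded output would be a question whose answer is the supplied span; the base checkpoint used here only demonstrates the input/output plumbing.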
