Abstract

Transformer models have recently attracted interest for their wide range of natural language applications in domains such as education, healthcare, and entertainment. One such application is the design of inquisitive bots that ask questions on a given topic of interest. To improve the quality of the questions such a bot generates while leveraging the power of transformer models, the pipeline needs a component that also accounts for the topic of discussion. This research examines transformer models pre-trained on natural language, augmented with topic context and subsequently optimized with genetic algorithms, to generate more relevant questions. In our hybrid framework, we first apply topic modeling to extract context from the conversation history and then feed these topics into the conversational training pipeline to generate questions on the given category. We fine-tune the framework with a genetic algorithm to find optimized weights. Our experiments are carried out on a large-scale, publicly available dataset for conversational question generation. We further evaluate the trained models on a dataset of Java passages through qualitative analysis. Our analysis suggests that the genetic algorithm-optimized, topic-aware mechanism substantially improves the quality of generated questions in the conversational set-up.
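The abstract does not detail how the genetic algorithm tunes the framework's weights, so the following is only a generic sketch of GA-based weight optimization under assumed choices (single-point crossover, elitist selection, a toy fitness function standing in for the paper's question-quality objective; all names here are hypothetical):

```python
import random

def genetic_optimize(fitness, dim, pop_size=30, generations=50,
                     mutation_rate=0.1, seed=0):
    """Generic GA: evolve a weight vector of length `dim` that maximizes `fitness`."""
    rng = random.Random(seed)
    # Initial population: random weight vectors in [0, 1]^dim.
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Elitist selection: keep the fitter half as parents.
        parents = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)          # single-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:     # occasionally perturb one gene
                i = rng.randrange(dim)
                child[i] = min(1.0, max(0.0, child[i] + rng.uniform(-0.2, 0.2)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy stand-in for a question-quality score: best at weights (0.7, 0.3).
def toy_fitness(w):
    return -((w[0] - 0.7) ** 2 + (w[1] - 0.3) ** 2)

best = genetic_optimize(toy_fitness, dim=2)
```

In the paper's setting the fitness would instead score the relevance of questions generated under a candidate weight configuration; the GA itself is model-agnostic.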
