Abstract

Question generation is an important task in natural language processing that involves generating questions from a given text. This paper proposes a novel approach for dynamic question generation using a context-aware auto-encoded graph neural model. Our approach involves constructing a graph representation of the input text, where each node in the graph corresponds to a word or phrase in the text, and the edges represent the relationships between them. We then use an auto-encoder model to learn a compressed representation of the graph that captures the most important information in the input text. Finally, we use the compressed graph representation to generate questions by dynamically selecting nodes and edges based on their relevance to the context of the input text. We evaluate our approach on four benchmark datasets (SQuAD, Natural Questions, TriviaQA, and QuAC) and demonstrate that it outperforms existing state-of-the-art methods for dynamic question generation. Four performance metrics are used in the evaluation: BLEU, ROUGE, F1-score, and accuracy. The proposed approach achieves 92% accuracy on SQuAD, 89% on QuAC, 84% on TriviaQA, and 79% on Natural Questions. Our results suggest that the use of graph neural networks and auto-encoder models can significantly improve the accuracy and effectiveness of question generation in NLP. Further research in this area can lead to even more sophisticated models that can generate questions that are even more contextually relevant and natural-sounding.
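The pipeline described above (graph construction, then a compressed graph representation) can be sketched as follows. This is a minimal, hypothetical illustration only: the paper's actual graph construction and auto-encoder architecture are not specified in the abstract, so this sketch uses word-level co-occurrence edges and a truncated-SVD rank reduction as a linear stand-in for the learned encoder.

```python
# Hypothetical sketch: build a word-level co-occurrence graph from text,
# then compress its adjacency matrix into low-dimensional node embeddings.
# The SVD step is a linear stand-in for the paper's learned auto-encoder.
import numpy as np

def build_graph(text, window=2):
    """Nodes = unique words; edge weight = co-occurrence count within a window."""
    words = text.lower().split()
    vocab = sorted(set(words))
    idx = {w: i for i, w in enumerate(vocab)}
    adj = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(words):
        lo, hi = max(0, i - window), min(len(words), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                adj[idx[w], idx[words[j]]] += 1
    return vocab, adj

def compress(adj, k=2):
    """Rank-k factorization of the adjacency: k-dimensional node embeddings."""
    u, s, _ = np.linalg.svd(adj)
    return u[:, :k] * s[:k]

vocab, adj = build_graph("the cat sat on the mat the cat ran")
emb = compress(adj, k=2)   # one k-dimensional vector per graph node
```

In the full model, these node embeddings would feed a decoder that scores nodes and edges for relevance to the context and selects them to form questions.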
