Abstract

Question generation, the task of automatically creating questions that can be answered by a certain span of text within a given passage, is important for question-answering and conversational systems in digital assistants such as Alexa, Cortana, Google Assistant, and Siri. Recent sequence-to-sequence neural models have outperformed previous rule-based systems. Existing models have mainly focused on using one or two sentences as input. Long text has posed challenges for sequence-to-sequence neural models in question generation: worse performance has been reported when the whole paragraph (with multiple sentences) is used as input. In reality, however, the whole paragraph is often required as context to generate high-quality questions. In this paper, we propose a maxout pointer mechanism with a gated self-attention encoder to address the challenges of processing long text inputs for question generation. With sentence-level inputs, our model outperforms previous approaches with either sentence-level or paragraph-level inputs. Furthermore, our model can effectively utilize paragraphs as inputs, pushing the state-of-the-art result from 13.9 to 16.3 (BLEU-4).
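The maxout pointer idea mentioned above can be illustrated with a minimal sketch. A standard pointer (copy) mechanism sums attention scores over every position where a token occurs, which inflates copy probabilities for tokens repeated across a long paragraph; a maxout pointer instead keeps only the maximum score per distinct token. The function name, tokens, and scores below are illustrative assumptions, not from the paper:

```python
def maxout_pointer_scores(tokens, attn_scores):
    """Collapse per-position attention scores into one copy score per
    distinct token, keeping the maximum over repeated occurrences
    (rather than the sum, as a standard pointer would)."""
    best = {}
    for tok, score in zip(tokens, attn_scores):
        best[tok] = max(score, best.get(tok, float("-inf")))
    return best

# "the" occurs twice; maxout keeps max(0.05, 0.30) = 0.30
# instead of the summed 0.35 a plain pointer would assign.
tokens = ["the", "cat", "sat", "on", "the", "mat"]
attn = [0.05, 0.40, 0.10, 0.05, 0.30, 0.10]
print(maxout_pointer_scores(tokens, attn))
```

This keeps frequent function words in long inputs from dominating the copy distribution, which is one reason the mechanism helps at paragraph level.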

Highlights

  • Question generation (QG), aiming at creating questions from natural language text, e.g. a sentence or paragraph, is an important area in natural language processing (NLP)

  • A conversational system can be proactive by asking the user questions (Shum et al., 2018), while a QnA system can benefit from a large-scale question-answering corpus created by an automated QG system (Duan et al., 2017)

  • In NLP, QG has been mainly tackled by two approaches: 1) the rule-based approach, e.g. (Heilman and Smith, 2010; Mazidi and Nielsen, 2014; Labutov et al., 2015), and 2) the neural QG approach, i.e. end-to-end training of a neural network using the sequence-to-sequence framework, e.g. (Du et al., 2017; Yuan et al., 2017; Song et al., 2017; Zhou et al., 2017)

Summary

Introduction

Question generation (QG), aiming at creating questions from natural language text, e.g. a sentence or paragraph, is an important area in natural language processing (NLP). A conversational system can be proactive by asking the user questions (Shum et al., 2018), while a QnA system can benefit from a large-scale question-answering corpus created by an automated QG system (Duan et al., 2017). Education is another key application, where QG can help with reading comprehension (Heilman and Smith, 2010).

