Abstract

This paper presents a question generation system based on semantic rewriting. State-of-the-art deep linguistic parsing and generation tools are employed to convert (back and forth) between natural language sentences and their meaning representations in the form of Minimal Recursion Semantics (MRS). By operating carefully on the semantic structures, we show a principled way of generating questions without ad hoc manipulation of the syntactic structures. Based on a (partial) understanding of the sentence meaning, the system generates questions that are semantically grounded and purposeful. With the support of deep linguistic grammars, the grammaticality of the generated output is guaranteed. Further, a specialized ranking model refines the linguistic realizations produced by the general-purpose generation model for the question generation task. The evaluation results from QGSTEC 2010 show promising prospects for the proposed approach.

Highlights

  • Question Generation (QG) is the task of generating reasonable questions from an input, which can be structured or unstructured

  • We narrow the task of QG down to taking a natural language text as input, as it is a more interesting challenge that involves a joint effort between Natural Language Understanding (NLU) and Natural Language Generation (NLG)

  • Minimal Recursion Semantics (MRS) is a theory of semantic representation of natural language sentences with a focus on the underspecification of scope ambiguities. PET is used as a parser to interpret a natural language sentence into an MRS structure with the guidance of a hand-written grammar

Introduction

Question Generation (QG) is the task of generating reasonable questions from an input, which can be structured (e.g. a database) or unstructured (e.g. a text). We narrow the task of QG down to taking a natural language text as input (textual QG), as it is a more interesting challenge that involves a joint effort between Natural Language Understanding (NLU) and Natural Language Generation (NLG). The proposed system consists of multiple syntactic and semantic processing components based on the DELPH-IN tool-chain (e.g. ERG/LKB/PET), while the theoretical support comes from Minimal Recursion Semantics (MRS). The generation component of LKB takes in an MRS structure and produces various realizations as natural language sentences. Both directions of processing are guided by the English Resource Grammar (ERG), which includes a large-scale hand-crafted lexicon and sophisticated grammar rules covering the most essential syntactic constructions of English, and which connects natural language sentences with their meaning representations in MRS. The tools are developed in the context of the DELPH-IN collaboration and are freely available as an open-source repository.
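To make the semantic-rewriting idea concrete, the following is a minimal, illustrative sketch only, not the actual ERG/LKB/PET machinery: an MRS is modeled here as a flat list of elementary predications (dictionaries with simplified predicate names), and a declarative sentence is turned into a wh-question by rewriting the predications that introduce the targeted variable. All predicate names and the `make_wh_question` helper are hypothetical simplifications for exposition.

```python
# Illustrative sketch of semantic rewriting for question generation.
# NOTE: this is NOT the DELPH-IN/ERG representation; an MRS is modeled
# as a flat list of elementary predications (EPs), each a dict with a
# predicate name and argument variables. Predicate names are simplified.

def make_wh_question(eps, target_var):
    """Rewrite the EPs introducing `target_var` into a wh-phrase:
    the quantifier becomes 'which_q' and the noun becomes the
    underspecified 'thing' predicate (i.e. "what")."""
    rewritten = []
    for ep in eps:
        if ep.get("ARG0") == target_var and ep["pred"].endswith("_q"):
            # Replace the quantifier of the questioned variable.
            rewritten.append({"pred": "which_q", "ARG0": target_var})
        elif ep.get("ARG0") == target_var and "_n_" in ep["pred"]:
            # Replace the noun predicate with the underspecified one.
            rewritten.append({"pred": "thing", "ARG0": target_var})
        else:
            rewritten.append(ep)
    return rewritten

# Simplified MRS for "The dog chased the cat." -> question the subject (x1),
# conceptually yielding the semantics of "What chased the cat?".
mrs = [
    {"pred": "_the_q", "ARG0": "x1"},
    {"pred": "_dog_n_1", "ARG0": "x1"},
    {"pred": "_chase_v_1", "ARG0": "e2", "ARG1": "x1", "ARG2": "x3"},
    {"pred": "_the_q", "ARG0": "x3"},
    {"pred": "_cat_n_1", "ARG0": "x3"},
]

question_mrs = make_wh_question(mrs, "x1")
preds = [ep["pred"] for ep in question_mrs]
print(preds)  # ['which_q', 'thing', '_chase_v_1', '_the_q', '_cat_n_1']
```

In the real system, the rewritten MRS would be handed to the LKB generator, which realizes it as a grammatical question under the ERG; this sketch only shows why operating on the semantics sidesteps ad hoc syntactic transformation.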
