Abstract

Sequence-to-sequence models have shown strong performance in a wide range of NLP tasks, yet their application to sentence generation from logical representations remains underdeveloped. In this paper, we present a sequence-to-sequence model for generating sentences from logical meaning representations based on event semantics. We use a semantic parsing system based on Combinatory Categorial Grammar (CCG) to obtain data annotated with logical formulas. We augment our sequence-to-sequence model with masking for predicates to constrain output sentences. We also propose a novel evaluation method for generation using Recognizing Textual Entailment (RTE). Combining parsing and generation, we test whether or not the output sentence entails the original text and vice versa. Experiments showed that our model outperformed a baseline with respect to both BLEU scores and RTE accuracy.
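The predicate masking described above can be illustrated with a minimal sketch. The paper's exact mechanism is not shown here; this sketch assumes that masking works by forcing the decoder's probability to zero for content words that do not appear as predicates in the input formula. The vocabulary, the `predicate_mask` helper, and the choice of always-allowed tokens are all hypothetical.

```python
import math

# Hypothetical toy vocabulary for illustration only.
VOCAB = ["<s>", "</s>", "dog", "cat", "run", "sleep"]

def predicate_mask(formula_predicates, always_allowed=("<s>", "</s>")):
    """Return the set of vocabulary indices the decoder may emit:
    content words are restricted to the predicates of the input formula,
    while structural tokens stay available."""
    allowed = set(formula_predicates) | set(always_allowed)
    return {i for i, w in enumerate(VOCAB) if w in allowed}

def masked_softmax(logits, allowed_ids):
    """Softmax over decoder logits with disallowed tokens set to -inf,
    so they receive exactly zero probability."""
    masked = [x if i in allowed_ids else float("-inf")
              for i, x in enumerate(logits)]
    m = max(masked)
    exps = [math.exp(x - m) for x in masked]  # exp(-inf) evaluates to 0.0
    z = sum(exps)
    return [e / z for e in exps]
```

For example, if the input formula mentions only the predicates `dog` and `run`, the decoder's distribution assigns zero probability to `cat` and `sleep`, constraining the output sentence to the formula's content words.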

Highlights

  • In recent years, syntactic and semantic parsing has been developed and improved significantly

  • One advantage of using logical formulas in semantic parsing is that they have expressive power that goes beyond simple representations such as predicate-argument structures

Summary

Introduction

Syntactic and semantic parsing have been developed and improved significantly. In combination with the restricted use of higher-order logic (HOL) developed in formal semantics, the logical formulas produced by such parsers have recently been used for RTE (Mineshima et al., 2015; Abzianidze, 2015) and Semantic Textual Similarity (STS) (Yanaka et al., 2017), achieving high accuracy. Compared with these recent developments in syntactic and semantic parsing, automatic generation of sentences from expressive logical formulas has received relatively less attention, despite a long and venerable tradition of work on surface realization, including work based on Minimal Recursion Semantics (MRS) (Carroll et al., 1999; Carroll and Oepen, 2005) and CCG (White, 2006; White and Rajkumar, 2009).

Input logical formula
Related Work
Embedding
Sequence-to-Sequence with Attention
Masking
Dataset
Evaluation
Results
Conclusion