Abstract

Facilitating information sharing between the two complementary tasks of Natural Language Understanding (NLU) and Natural Language Generation (NLG) is crucial to Natural Language Processing (NLP). NLU extracts the core semantics from a given utterance, while NLG, in contrast, constructs a sentence that expresses given semantics. Model training for both tasks relies on manually annotated data, yet the complexity of the annotation process makes such data costly to acquire at scale. Moreover, few existing studies have treated NLU and NLG as dual tasks. Indeed, both can be viewed as translation problems: NLU translates natural language into a formal representation, while NLG converts a formal representation into natural language. In this paper, we propose a Transformer-based Natural Language Understanding and Generation (T-NLU&G) model that jointly models NLU and NLG by introducing a shared latent variable. This latent variable allows us to explore the intrinsic connection between the natural language space and the formal representation space and to facilitate information sharing between the two. Experiments show that our model achieves performance gains on both the E2E and Weather datasets, validating the feasibility and effectiveness of the T-NLU&G model for both tasks, and that it is competitive with current state-of-the-art methods.
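
One way to make the shared-latent-variable idea concrete is the following illustrative sketch; the abstract does not state the paper's exact objective, so this is an assumed variational formulation, not the authors' definition. Let $x$ denote a natural-language utterance, $y$ its formal semantic representation, and $z$ a latent variable shared by both directions:

\[
\mathcal{L}(\theta, \phi) \;=\; \mathbb{E}_{q_\phi(z \mid x, y)}\!\big[\log p_\theta(y \mid x, z) \;+\; \log p_\theta(x \mid y, z)\big] \;-\; \mathrm{KL}\!\big(q_\phi(z \mid x, y)\,\|\,p(z)\big),
\]

where the first term corresponds to NLU, the second to NLG, and the shared $z$ is the channel through which information flows between the natural language space and the formal representation space.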
