Abstract

Multi-lingual question generation (QG) is the task of generating natural language questions for a given answer passage in any target language. In this paper, we design a system to support multi-lingual QG in the "People Also Ask" (PAA) module of Bing. In the zero-shot setting, the primary challenge is to transfer knowledge from a QG model trained in the pivot language to other languages without adding training data in those languages. Compared to other zero-shot tasks, the distinguishing and challenging aspect of QG is preserving question structure so that the generated output is interrogative. Existing models for similar tasks tend to generate natural language queries or copy sub-spans of the passage, failing to preserve the question structure. In our work, we demonstrate how knowledge transfer in multi-lingual IQG (Interrogative QG) can be significantly improved using auxiliary tasks, either in a multi-task or a pre-training setting. We explore two such tasks: cross-lingual translation and multi-lingual denoising auto-encoding of questions, which are particularly effective when combined with translate-train. Using data for 13 languages from Bing PAA as well as online A/B tests, we show that both tasks significantly improve the quality of zero-shot IQG on languages not seen during training.
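To make the auxiliary tasks concrete, the sketch below shows one simplified way that denoising auto-encoding (DAE) examples over questions and standard QG examples could be constructed and mixed for multi-task training of a multilingual seq2seq model. The noising scheme, mask token, input formatting, and function names are illustrative assumptions, not the implementation described in the paper.

```python
# Illustrative sketch only: building DAE and QG training examples.
# The exact noising scheme and task mixing used in the paper are not
# specified in the abstract; everything below is an assumption.
import random

MASK_TOKEN = "<mask>"  # assumed sentinel; depends on the tokenizer actually used


def add_noise(question, mask_ratio=0.3, seed=None):
    """Corrupt a question by replacing a random contiguous span of tokens
    with a single mask token (simplified text infilling)."""
    rng = random.Random(seed)
    tokens = question.split()
    if len(tokens) < 3:
        return question
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = rng.randrange(0, len(tokens) - span_len + 1)
    return " ".join(tokens[:start] + [MASK_TOKEN] + tokens[start + span_len:])


def make_dae_example(question, lang):
    """Build a (noisy question -> original question) pair for the DAE task."""
    return {"task": "dae", "lang": lang,
            "source": add_noise(question), "target": question}


def make_qg_example(passage, answer, question, lang):
    """Build a (answer + passage -> question) pair for the main QG task."""
    return {"task": "qg", "lang": lang,
            "source": f"answer: {answer} context: {passage}", "target": question}


if __name__ == "__main__":
    qg = make_qg_example(
        passage="The Eiffel Tower was completed in 1889.",
        answer="1889",
        question="When was the Eiffel Tower completed?",
        lang="en")
    dae = make_dae_example("Wann wurde der Eiffelturm fertiggestellt?", lang="de")
    # In a multi-task setup, batches from both tasks would be interleaved
    # when fine-tuning a multilingual seq2seq model (e.g., mBART or mT5).
    for ex in (qg, dae):
        print(ex)
```

Because the DAE targets are questions in the non-pivot languages, this kind of auxiliary objective exposes the decoder to interrogative structure in those languages even though no passage-question QG supervision exists for them.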
