Abstract

Domain adaptation for text summarization remains challenging due to the lack of annotated data in the target domain. Previous methods focus on introducing target-domain knowledge to shift a model toward the target domain, but they mostly study adaptation to a single low-resource domain, which limits their practicality. In this paper, we propose MTL-DAS, a unified model for multidomain adaptive text summarization, which stands for Multitask Learning for Multidomain Adaptation Summarization. Built on BART, the model uses multitask learning to strengthen generalization across multiple domains: it transfers the ability to detect summary-worthy content from the source domain and acquires target-domain knowledge and generation style through auxiliary text reconstruction and text classification tasks. We evaluate domain adaptation ability on the AdaptSum dataset, which covers six domains in low-resource scenarios. The experiments show that the unified model not only outperforms separately trained models but also requires less training time and fewer computational resources.
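To make the setup concrete, below is a minimal sketch (not the authors' released code) of how a multitask objective of this kind can be wired on top of BART with the HuggingFace transformers library. The class name, the mean-pooled classification head, and the unweighted loss sum are illustrative assumptions; in this scheme, summarization and text reconstruction share the same seq2seq loss and differ only in how the target labels are constructed.

```python
# Hypothetical multitask sketch: a shared BART backbone trained jointly on
# summarization, text reconstruction (denoising), and text classification.
import torch
import torch.nn as nn
from transformers import BartForConditionalGeneration

class MultitaskSummarizer(nn.Module):
    def __init__(self, model_name="facebook/bart-base", num_domains=6):
        super().__init__()
        self.bart = BartForConditionalGeneration.from_pretrained(model_name)
        hidden = self.bart.config.d_model
        # Illustrative classification head over the encoder representation.
        self.domain_head = nn.Linear(hidden, num_domains)

    def forward(self, input_ids, attention_mask, labels=None, domain_labels=None):
        # The seq2seq loss covers both summarization and text reconstruction:
        # `labels` hold either the gold summary or the original text to be
        # recovered from a corrupted input.
        out = self.bart(input_ids=input_ids,
                        attention_mask=attention_mask,
                        labels=labels)
        loss = out.loss
        if domain_labels is not None:
            enc = out.encoder_last_hidden_state   # (batch, seq_len, hidden)
            pooled = enc.mean(dim=1)              # simple mean pooling
            cls_loss = nn.functional.cross_entropy(
                self.domain_head(pooled), domain_labels)
            loss = loss + cls_loss                # unweighted sum, for brevity
        return loss
```

A single model instance trained this way serves all target domains at once, which is what allows one unified model to replace the per-domain models it is compared against.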
