Abstract

In the abstractive single-document summarization task, generated summaries often suffer from fabricated and uninformative content. An intuitive way to alleviate this problem is to incorporate external semantic knowledge into the model framework. In this paper, we incorporate explicit graphs built from semantic knowledge, including term frequency, discourse information, and entities with their relations, into neural abstractive summarization. We propose a novel model for abstractive single-document Summarization based on Semantic Knowledge Graphs (SKGSUM), which regards sentences and entities as nodes, captures the relations between units at different textual levels, and focuses on salient content in the source document to guide the summary generation process. To the best of our knowledge, we are the first to exploit explicit graph representations at different textual-unit levels in a unified framework for neural abstractive summarization. Results show that our model achieves significant improvements over strong baselines on both the XSum and CNN/Daily Mail datasets. Human evaluations further indicate that our model can generate informative and coherent summaries.
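The abstract describes a graph whose nodes are sentences and entities, connected by relations across textual levels. The sketch below illustrates one plausible way to construct such a heterogeneous graph; it is a minimal assumption-laden example, not the paper's actual implementation, and the function name `build_semantic_graph` and its edge-weighting scheme are hypothetical.

```python
# Hypothetical sketch of a heterogeneous sentence-entity graph, loosely
# following the abstract's description. Entity extraction is assumed to be
# done upstream; edge weights here are simple counts, not the paper's method.
from collections import Counter, defaultdict
from itertools import combinations


def build_semantic_graph(sentences, entities_per_sentence):
    """Return nodes and weighted edges: sentence-entity edges for mentions,
    and entity-entity edges for co-occurrence within a sentence."""
    nodes = {f"sent_{i}": {"type": "sentence", "text": s}
             for i, s in enumerate(sentences)}
    edges = defaultdict(float)
    cooccur = Counter()

    for i, ents in enumerate(entities_per_sentence):
        for e in ents:
            nodes.setdefault(f"ent_{e}", {"type": "entity", "text": e})
            # sentence-entity edge weighted by mention frequency
            edges[(f"sent_{i}", f"ent_{e}")] += 1.0
        # entity-entity edges for entities mentioned in the same sentence
        for a, b in combinations(sorted(set(ents)), 2):
            cooccur[(f"ent_{a}", f"ent_{b}")] += 1

    for pair, count in cooccur.items():
        edges[pair] = float(count)
    return nodes, dict(edges)


if __name__ == "__main__":
    sents = ["Apple unveiled the iPhone.", "The iPhone boosted Apple's sales."]
    ents = [["Apple", "iPhone"], ["iPhone", "Apple"]]
    nodes, edges = build_semantic_graph(sents, ents)
    print(len(nodes), "nodes,", len(edges), "edges")
```

In a full model, such a graph would typically be encoded (e.g., with a graph neural network) so that salient nodes can guide the decoder during summary generation.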
