Abstract

In the abstractive single-document summarization task, generated summaries often suffer from fabricated and uninformative content. An intuitive way to alleviate this problem is to incorporate external semantic knowledge into the model. In this paper, we incorporate explicit graphs built from semantic knowledge, including term frequency, discourse information, and entities with their relations, into neural abstractive summarization. We propose a novel model for abstractive single-document Summarization based on Semantic Knowledge Graphs (SKGSUM), which regards sentences and entities as nodes, captures the relations between units at different textual levels, and focuses on salient content in the source documents to guide the summary generation process. To the best of our knowledge, we are the first to exploit explicit graph representations at different textual-unit levels in a unified framework for neural abstractive summarization. Results show that our model achieves significant improvements over strong baselines on both the XSum and CNN/Daily Mail datasets. Human evaluations further indicate that our model can generate informative and coherent summaries.
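
To make the graph structure described above more concrete, the sketch below illustrates the general idea of a document graph whose nodes are sentences and entities, with edges for entity mentions and term-frequency similarity. This is a minimal, illustrative sketch and not the SKGSUM implementation: `SemanticGraph`, `build_graph`, and `tf_similarity` are hypothetical names, discourse edges are omitted, and a real system would feed such a graph into a learned graph encoder rather than inspect it directly.

```python
# Minimal sketch of an explicit semantic graph over a document:
# sentence nodes plus entity nodes, with "mentions" edges linking entities
# to the sentences that contain them and "tf_similar" edges linking
# sentences with high term-frequency overlap. Hypothetical names throughout;
# not the authors' implementation.

from collections import Counter
from dataclasses import dataclass, field

@dataclass
class SemanticGraph:
    sentence_nodes: list = field(default_factory=list)   # sentence texts
    entity_nodes: list = field(default_factory=list)     # entity strings
    edges: list = field(default_factory=list)            # (src, dst, relation)

def tf_similarity(a: str, b: str) -> float:
    """Overlap between two sentences' term-frequency vectors."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    shared = sum((ca & cb).values())
    total = max(1, min(sum(ca.values()), sum(cb.values())))
    return shared / total

def build_graph(sentences, entities, threshold=0.3) -> SemanticGraph:
    g = SemanticGraph(sentence_nodes=list(sentences), entity_nodes=list(entities))
    # Entity -> sentence edges: connect an entity to every sentence mentioning it.
    for e_idx, entity in enumerate(entities):
        for s_idx, sent in enumerate(sentences):
            if entity.lower() in sent.lower():
                g.edges.append((f"E{e_idx}", f"S{s_idx}", "mentions"))
    # Sentence -> sentence edges: connect sentences with high term overlap.
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            if tf_similarity(sentences[i], sentences[j]) >= threshold:
                g.edges.append((f"S{i}", f"S{j}", "tf_similar"))
    return g

if __name__ == "__main__":
    doc = ["Apple unveiled a new chip.", "The chip powers Apple laptops."]
    graph = build_graph(doc, entities=["Apple", "chip"])
    print(graph.edges)
```

In this framing, the graph encodes cross-sentence and sentence-entity relations explicitly, which is what allows a summarizer to attend to salient units rather than relying only on token-level context.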
