Abstract

Automatic highlight generation from text is an abstractive summarization problem that receives frequent attention in natural language processing. In encoder-decoder architectures developed for abstractive summarization, learning becomes harder as the size of the input grows. The attention mechanism is used to mitigate this disadvantage of encoder-decoder architectures. In this study, we used an LSTM encoder-decoder with an attention mechanism to generate highlights abstractively. In addition, we applied an extractive summarization step as a preprocessing stage to reduce the input text size and improve the learning ability of the encoder-decoder architecture. For this extractive step we chose the PageRank method, in which sentence vectors built from GloVe embeddings are used to compute the similarities between the sentences of the text. The proposed approach achieved a ROUGE-1 score of 67.6% for extractive summarization and 59.6% for abstractive summarization.
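The extractive preprocessing step described above can be sketched as follows. This is an illustrative sketch, not the authors' code: random vectors stand in for averaged GloVe sentence embeddings, pairwise cosine similarity forms a sentence graph, and a simple power-iteration PageRank ranks the sentences; the function names and parameters are this sketch's own.

```python
import numpy as np

def cosine_sim_matrix(vectors):
    """Pairwise cosine similarities between sentence vectors, diagonal zeroed."""
    norms = np.linalg.norm(vectors, axis=1, keepdims=True)
    unit = vectors / np.maximum(norms, 1e-9)
    sim = unit @ unit.T
    np.fill_diagonal(sim, 0.0)
    return np.clip(sim, 0.0, None)  # keep non-negative edge weights

def pagerank(sim, d=0.85, iters=100):
    """Power-iteration PageRank over the sentence-similarity graph."""
    n = sim.shape[0]
    row_sums = sim.sum(axis=1, keepdims=True)
    # Row-normalize similarities into transition probabilities;
    # rows with no outgoing weight fall back to a uniform distribution.
    trans = np.divide(sim, row_sums, out=np.full_like(sim, 1.0 / n),
                      where=row_sums > 0)
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        scores = (1 - d) / n + d * trans.T @ scores
    return scores

# Toy input: 6 "sentences", each a 50-dim stand-in for a GloVe-based vector.
rng = np.random.default_rng(0)
sentence_vectors = rng.normal(size=(6, 50))
scores = pagerank(cosine_sim_matrix(sentence_vectors))
top = np.argsort(scores)[::-1][:2]  # indices of the two top-ranked sentences
```

In a real pipeline, the top-ranked sentences would form the shortened input passed to the LSTM encoder-decoder.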
