Abstract

Automatic highlight generation from text is an abstractive summarization problem that receives frequent attention in natural language processing. In encoder-decoder architectures developed for abstractive summarization, learning becomes more difficult as the size of the input sequence grows. The attention mechanism is commonly used to mitigate this weakness of encoder-decoder architectures. In this study, we used an LSTM encoder-decoder with an attention mechanism to generate highlights. In addition, we applied an extractive summarization step as preprocessing to reduce the input text size and improve the learning ability of the encoder-decoder architecture. For this extractive step we chose the PageRank method, in which sentence vectors built from GloVe embeddings are used to compute similarities between the sentences of the text. The proposed approach achieved a ROUGE-1 score of 67.6% for extractive summarization and 59.6% for abstractive summarization.
