Abstract

Pairing a well-suited architecture with an effective training algorithm improves learning efficiency for end users. Text summarization helps grow the number of quality learners by giving readers and learners a concise entry point into a document. Because documents are composed of words and sentences, document summarization must handle diverse words and their varied synonyms, which the model learns during training. A sequence-to-sequence (S2S) training mechanism provides the embeddings for sentences and documents, and a pointer-generator extends it into a new hybrid model for summary extraction. The proposed model implements an attention mechanism and uses a recurrent neural network with LSTM cells in both the encoder and the decoder. The working model weighs several factors when producing a summary, including sentence/document similarity, repetition, indexing, and sentence-level context richness. It also tracks the summarized text with a coverage mechanism to avoid repetition.
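The two mechanisms the abstract names, attention with coverage and pointer-generator mixing, can be sketched in a few lines of NumPy. This is a minimal illustration under assumed simplifications, not the authors' implementation: dot-product attention scoring, a linear coverage penalty, and the function names are all assumptions introduced here.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_with_coverage(enc_states, dec_state, coverage, w_cov=1.0):
    # Dot-product attention (an assumed scoring choice) with a coverage
    # penalty: positions already attended to are scored down, which is
    # how coverage discourages repetition in the summary.
    scores = enc_states @ dec_state - w_cov * coverage
    attn = softmax(scores)
    return attn, coverage + attn  # coverage accumulates past attention

def pointer_generator(vocab_dist, attn, src_ids, p_gen, vocab_size):
    # Pointer-generator mixing: with probability p_gen generate from the
    # vocabulary distribution, otherwise copy a source token in proportion
    # to its attention weight.
    copy_dist = np.zeros(vocab_size)
    np.add.at(copy_dist, src_ids, attn)  # scatter-add handles repeats
    return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist

# Toy usage: 5 source tokens, 4-dim states, vocabulary of 20 tokens.
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 4))
dec = rng.normal(size=4)
coverage = np.zeros(5)
attn, coverage = attention_with_coverage(enc, dec, coverage)
vocab_dist = softmax(rng.normal(size=20))
final = pointer_generator(vocab_dist, attn, np.array([3, 7, 7, 11, 2]), 0.6, 20)
assert abs(final.sum() - 1.0) < 1e-9  # still a valid distribution
```

The key property the mixing preserves is that the final output remains a probability distribution over the vocabulary, while out-of-vocabulary source words become reachable through the copy term.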
