Abstract
In the rapidly evolving field of natural language processing, the demand for efficient automated text summarization systems that not only distill extensive documents but also capture their nuanced thematic elements has never been greater. This paper introduces the FuzzyTP-BERT framework, a novel approach to extractive text summarization that synergistically combines Fuzzy Topic Modeling (FuzzyTM) with the advanced capabilities of Bidirectional Encoder Representations from Transformers (BERT). Unlike traditional extractive methods, FuzzyTP-BERT integrates fuzzy logic to refine topic modeling, enhancing the semantic sensitivity of summaries by allowing a more nuanced representation of word-topic relationships. This integration yields summaries that are not only coherent but also thematically rich, addressing a significant gap in current summarization technology. Extensive evaluations on benchmark datasets demonstrate that FuzzyTP-BERT significantly outperforms existing models in terms of ROUGE scores, effectively balancing topical relevance with semantic coherence. Our findings suggest that incorporating fuzzy logic into deep learning frameworks can markedly improve the quality of automated text summaries, potentially benefiting a wide range of applications in an age of information overload.
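The abstract describes scoring sentences by combining fuzzy word-topic memberships with BERT-derived semantic representations. The sketch below is a hypothetical illustration of that general idea, not the authors' actual method: sentence embeddings (which would come from BERT) and a word-to-topic membership table (which would come from FuzzyTM) are assumed as given inputs, and each sentence is ranked by a weighted blend of semantic centrality and fuzzy topical relevance.

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def fuzzy_topic_score(sentence_tokens, word_topic_memberships):
    # Average, over the sentence's words, of each word's maximum fuzzy
    # topic membership; words unknown to the topic model contribute 0.
    # (One plausible aggregation; the paper's exact formula may differ.)
    if not sentence_tokens:
        return 0.0
    total = sum(max(word_topic_memberships.get(w, [0.0]))
                for w in sentence_tokens)
    return total / len(sentence_tokens)

def rank_sentences(embeddings, token_lists, memberships, alpha=0.5):
    # The centroid of the sentence embeddings stands in for a
    # document-level BERT representation (an assumption for this sketch).
    dim = len(embeddings[0])
    centroid = [sum(e[i] for e in embeddings) / len(embeddings)
                for i in range(dim)]
    scores = []
    for emb, toks in zip(embeddings, token_lists):
        semantic = cosine(emb, centroid)          # BERT-style coherence
        topical = fuzzy_topic_score(toks, memberships)  # FuzzyTM relevance
        scores.append(alpha * semantic + (1 - alpha) * topical)
    # Indices of sentences, best-scoring first; an extractive summary
    # would keep the top-k of these.
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
```

In a real pipeline the toy embeddings would be replaced by BERT sentence vectors and the membership table by trained FuzzyTM output; the blend weight `alpha` is a hypothetical knob for trading topical relevance against semantic coherence.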
Journal of King Saud University - Computer and Information Sciences