Abstract

To tackle the textual information overload that is growing exponentially, with much redundancy, across the Internet, this paper investigates a solution based on the Automatic Text Summarization (ATS) method. The idea of ATS is to assist readers, e.g., online readers, by providing a condensed version of a text, saving the time and effort required to skim a large body of text. However, ATS is considered one of the most complex NLP applications, particularly for the Arabic language, which has not received the same development attention as the Indo-European languages. We therefore present an extractive summarizer (ArDBertSum) for text written in Arabic, relying on the DistilBERT model. In addition, we propose a domain-specific sentence-clause segmenter (SCSAR) that supports ArDBertSum by further shortening long or complex sentences. Our experimental results show that ArDBertSum yields the best performance among non-heuristic Arabic summarizers, producing candidate summaries of acceptable quality. The experiments were conducted on the EASC dataset (together with our proposed dataset) and report (1) a statistical evaluation using ROUGE metrics and (2) a dedicated human evaluation. The human evaluation revealed promising perceptions; however, further work is needed to improve the coherence and punctuation of the automatic summaries.
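The paper's SCSAR rules are not reproduced on this page. As a minimal illustration of what clause-level segmentation of long Arabic sentences involves, the sketch below splits a sentence at the Arabic comma and before words beginning with the coordinating conjunction "و" (and) or "أو" (or). The function name `split_clauses`, the delimiter list, and the `min_words` threshold are all hypothetical stand-ins, not the authors' actual rule set.

```python
import re

# Hypothetical clause delimiters: Arabic comma, semicolons, and whitespace
# before a word starting with the conjunction "و" (and) or "أو" (or).
# This is an illustrative stand-in for SCSAR, not the paper's rule set.
CLAUSE_SPLIT = re.compile(r"[،;؛]|\s+(?=و|أو)")

def split_clauses(sentence: str, min_words: int = 3) -> list[str]:
    """Split a long Arabic sentence into clause candidates,
    discarding fragments shorter than `min_words` words."""
    parts = (p.strip() for p in CLAUSE_SPLIT.split(sentence))
    return [p for p in parts if len(p.split()) >= min_words]
```

A real segmenter would also have to distinguish the conjunction "و" from ordinary words that merely begin with that letter, which is one reason the paper proposes dedicated, domain-specific rules.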

Highlights

  • In this era of digitalization, tremendous amounts of textual data and electronic documents are growing exponentially and diffusing rapidly over the Internet

  • Automatic Text Summarization is considered one of the most complex Natural Language Processing (NLP) applications, particularly for the Arabic language, which has not received the same development attention as the Indo-European languages

  • Towards producing a summarizer for text written in Arabic, relying on a pre-trained Language Understanding Model (LUM), this paper examines the ability of a fine-tuned version of DistilBERT to address Arabic Automatic Text Summarization (ATS), concluding by offering a summarizer (ArDBertSum)

Summary

INTRODUCTION

In this era of digitalization, tremendous amounts of textual data and electronic documents are growing exponentially and diffusing rapidly over the Internet. ATS approaches are commonly classified along several dimensions: abstractive, extractive, or hybrid; monolingual or multilingual; domain-specific, generic, or query-based; single-document or multi-document; and indicative or informative. In detail, the abstractive summarization method aims to construct new sentences (sometimes with a paraphrasing technique [21]) to produce a candidate summary, relying on an understanding of the observed input texts; see [22]. [13] design an Arabic text summarizer that focuses on reducing redundancy and noisy data in a given input multi-document collection; their underlying technique is implemented as an unsupervised score-based method. Our goal is to investigate the overall utility of the DistilBERT model for Arabic summarization using statistical ROUGE metrics. This investigation implicitly estimates the efficiency of the DistilBERT-based intermediate representation of Arabic texts (i.e., the word-embedding method) as well as the accuracy of sentence tokenization and scoring.
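The full ArDBertSum architecture is not shown on this page. The sketch below illustrates the generic extractive pipeline the paragraph describes — embed each sentence, score it against the document as a whole, and keep the top-ranked sentences in their original order. Simple bag-of-words count vectors stand in for the DistilBERT sentence embeddings the real system relies on, and the names `embed` and `summarize` are illustrative.

```python
import math
from collections import Counter

def embed(sentence: str) -> Counter:
    # Stand-in for a DistilBERT sentence embedding: a bag-of-words
    # count vector. The real system uses contextual embeddings.
    return Counter(sentence.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def summarize(sentences: list[str], k: int = 2) -> list[str]:
    """Score each sentence by similarity to the document centroid
    and return the top-k sentences in document order."""
    centroid = embed(" ".join(sentences))
    ranked = sorted(range(len(sentences)),
                    key=lambda i: cosine(embed(sentences[i]), centroid),
                    reverse=True)
    keep = sorted(ranked[:k])  # restore original order
    return [sentences[i] for i in keep]
```

Under this scheme, an off-topic sentence sharing few words with the rest of the document scores low and is dropped; swapping `embed` for a contextual encoder leaves the selection logic unchanged, which is the sense in which ROUGE scores of the final summary implicitly evaluate the intermediate representation.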

ArDBertSum
EXPERIMENTS AND KEY FINDINGS
PERFORMANCE OF ARDBERTSUM WITH OPTIMIZING THE INITIAL DistilBERT SUMMARY
PERFORMANCE COMPARISON WITH THE RELATED EXTRACTIVE APPROACHES
HUMAN EVALUATION
Literature
Findings
CONCLUSION
