Abstract

Sentiment analysis, the task of classifying textual content as positive, negative, or neutral, is commonly applied to data from social media platforms. Arabic, an official language of the United Nations, presents unique challenges for sentiment analysis due to its complex morphology and dialectal diversity, and research on Arabic sentiment analysis remains scarce compared to English. Transfer learning, which applies knowledge learned in one domain to another, can reduce training time and computational cost, but it remains underexplored for Arabic sentiment analysis. In this study, we develop a new hybrid model, RNN-BiLSTM, which merges recurrent neural networks (RNN) and bidirectional long short-term memory (BiLSTM) networks. We use Arabic bidirectional encoder representations from transformers (AraBERT), a state-of-the-art pre-trained transformer-based Arabic language model, to generate word-embedding vectors. The RNN-BiLSTM model combines the strengths of RNN and BiLSTM, namely the ability to learn sequential dependencies and bidirectional context. We trained the RNN-BiLSTM model on the source domain, the Arabic reviews dataset (ARD), where it outperforms the RNN and BiLSTM models with default parameters, achieving an accuracy of 95.75%. We then applied transfer learning to the RNN-BiLSTM model by fine-tuning its parameters using random search, and compared the fine-tuned RNN-BiLSTM model with the RNN and BiLSTM models on two target-domain datasets, ASTD and Aracust. The results show that the fine-tuned RNN-BiLSTM model is more effective for transfer learning, achieving accuracies of 95.44% and 96.19% on the ASTD and Aracust datasets, respectively.
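To make the architecture concrete, the sketch below shows one plausible way to assemble such a hybrid classifier in Keras; it is an illustrative assumption, not the authors' implementation. The layer sizes, the 768-dimensional AraBERT (BERT-base) token embeddings supplied as input, the padded sequence length, and the binary sentiment head are all assumed for the example.

# Illustrative sketch (assumed, not the authors' exact code): a hybrid
# RNN-BiLSTM sentiment classifier over pre-computed AraBERT token embeddings.
from tensorflow.keras import layers, models

MAX_LEN = 128   # assumed padded sequence length (tokens per review)
EMB_DIM = 768   # AraBERT (BERT-base) hidden size

def build_rnn_bilstm(rnn_units=64, lstm_units=64, dropout=0.3):
    # A SimpleRNN layer captures sequential dependencies; a Bidirectional
    # LSTM then adds left-to-right and right-to-left context.
    inputs = layers.Input(shape=(MAX_LEN, EMB_DIM), name="arabert_embeddings")
    x = layers.SimpleRNN(rnn_units, return_sequences=True)(inputs)
    x = layers.Bidirectional(layers.LSTM(lstm_units))(x)
    x = layers.Dropout(dropout)(x)
    outputs = layers.Dense(1, activation="sigmoid", name="sentiment")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_rnn_bilstm()
model.summary()

Under this reading, the random-search fine-tuning described in the abstract would amount to sampling hyperparameters such as rnn_units, lstm_units, dropout, and the learning rate, then re-training on the target-domain data; the specific search space is not stated in the abstract.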
