Semantic text generation is a critical task in Natural Language Processing (NLP), facing challenges such as maintaining coherence across texts, preserving contextual relevance, and producing high-quality output. Traditional language models often produce grammatically inconsistent text. To address these issues, we introduce Convolutional Attention Bi-LSTM with Whale Optimization (CANBLWO), a novel hybrid model that integrates a Convolutional Attention Network (CAN), Bidirectional Long Short-Term Memory (Bi-LSTM), and the Whale Optimization Algorithm (WOA). CANBLWO aims to generate semantically rich and coherent text, and it outperforms traditional models such as Long Short-Term Memory (LSTM), Recurrent Neural Networks (RNN), Bi-LSTM, and Bi-LSTM with attention, as well as Bidirectional Encoder Representations from Transformers (BERT) and Generative Pre-trained Transformer 2 (GPT-2). Our model achieved scores of 0.79, 0.78, 0.76, and 0.82 on the Metric for Evaluation of Translation with Explicit Ordering (METEOR), Bilingual Evaluation Understudy (BLEU), Consensus-based Image Description Evaluation (CIDEr), and Recall-Oriented Understudy for Gisting Evaluation (ROUGE) metrics, respectively. The proposed model also demonstrates 97% and 96% accuracy on the Wiki-Bio and Code/Natural Language Challenge (CoNaLa) datasets, respectively, highlighting its effectiveness against Large Language Models (LLMs). This study underscores the potential of hybrid approaches for enhancing semantic text generation.
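To make the hybrid architecture concrete, the following is a minimal PyTorch sketch of the pipeline the abstract describes: a convolutional stage with self-attention (the CAN component) feeding a Bi-LSTM that produces per-token vocabulary logits. All layer sizes, the class name `CANBiLSTM`, and the specific attention formulation are illustrative assumptions, not the authors' published configuration; the WOA step, which would tune hyperparameters such as hidden sizes or learning rate, is omitted here.

```python
# Illustrative sketch only: layer names, dimensions, and the attention design
# are assumptions, not the paper's exact architecture. WOA hyperparameter
# search is not shown.
import torch
import torch.nn as nn

class CANBiLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, conv_channels=128, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Convolutional stage: extract local n-gram-like features
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        # Self-attention over the convolutional features (the "CAN" stage)
        self.attn = nn.MultiheadAttention(conv_channels, num_heads=4, batch_first=True)
        # Bidirectional LSTM over the attended features
        self.bilstm = nn.LSTM(conv_channels, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, vocab_size)   # next-token logits

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids)                      # (batch, seq, embed)
        c = torch.relu(self.conv(x.transpose(1, 2)))   # (batch, channels, seq)
        c = c.transpose(1, 2)                          # (batch, seq, channels)
        a, _ = self.attn(c, c, c)                      # attend over conv features
        h, _ = self.bilstm(a)                          # (batch, seq, 2*hidden)
        return self.out(h)                             # per-position vocab logits
```

In such a design, WOA would typically act as an outer-loop optimizer, searching over hyperparameters of a model like the one above rather than appearing inside the forward pass.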