Abstract

In natural language processing (NLP), a diverse array of language models has emerged to serve a wide spectrum of tasks, from speaker recognition and auto-correction to sentiment analysis and stock prediction. Language models are central to the execution of these NLP tasks. This study proposes an approach to improving accuracy through a hybrid language model that combines the strengths of the long short-term memory (LSTM) and gated recurrent unit (GRU) architectures: LSTM excels at preserving long-term dependencies in data, while GRU's simpler gating mechanism speeds up training. The research evaluates four model variants: LSTM, GRU, bidirectional long short-term memory (Bi-LSTM), and a combination of LSTM with GRU. These models are tested on two distinct datasets: one for IBM stock price prediction and the other for Jigsaw toxic comment classification (sentiment analysis). This work represents a significant step toward democratizing NLP capabilities, enabling improved model performance even in resource-constrained settings. The findings have implications across a wide spectrum of real-world applications and may stimulate further research in NLP.
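As a rough illustration of the hybrid architecture the abstract describes, the following is a minimal PyTorch sketch in which an LSTM layer feeds a GRU layer. The class name, layer sizes, and sequence shape are illustrative assumptions, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

class HybridLSTMGRU(nn.Module):
    """Hypothetical sketch of an LSTM+GRU hybrid: the LSTM layer captures
    long-term dependencies, and its outputs feed a GRU layer whose simpler
    gating is cheaper to train. Hyperparameters here are assumptions."""
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        lstm_out, _ = self.lstm(x)         # preserve long-range context
        gru_out, _ = self.gru(lstm_out)    # lighter gating over LSTM features
        return self.fc(gru_out[:, -1, :])  # predict from the final time step

# e.g. one-feature stock-price windows of length 60 (assumed shape)
model = HybridLSTMGRU(input_size=1, hidden_size=64, output_size=1)
y = model(torch.randn(8, 60, 1))  # -> (8, 1)
```

The same stacked structure could serve both tasks mentioned in the abstract, with the output head swapped for regression (stock price) or classification (toxic comments).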
