Abstract

Long Short-Term Memory (LSTM) networks are well suited to sequential data because their memory cells retain long-term information, whereas large Transformer-based language models, owing to their complex structures, demand substantial time and computational power to pre-train on billions of tokens across different Natural Language Processing (NLP) tasks. In this paper, a dynamic chaotic model is proposed to endow the network's neuron states with neural-dynamic characteristics by restructuring the LSTM as a Chaotic Neural Oscillatory Long Short-Term Memory (CNO-LSTM) model, in which the neurons in the LSTM memory cells are replaced by oscillatory neurons, with the aim of speeding up language-model training and improving text classification accuracy in real-world applications. For evaluation, five popular text classification datasets covering binary, multi-class, and multi-label classification are used to compare CNO-LSTM with mainstream baseline models on NLP tasks. The results show that CNO-LSTM, with its simplified model structure and oscillatory neuron states, outperforms the baseline models across these classification tasks on evaluation metrics such as Accuracy, Precision, Recall, and F1. The main contributions are reduced training time and improved accuracy: CNO-LSTM achieves a maximum reduction in training time of approximately 46.76% and an accuracy gain of 2.55% over the vanilla LSTM model, and a reduction in training time of approximately 35.86% compared with an attention model without oscillatory neurons, indicating that the restructured model reduces GPU dependency while improving training accuracy.
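Since the abstract only outlines the architecture, the following is a minimal, hypothetical PyTorch sketch of the core idea: replacing the tanh activations in an LSTM memory cell with a chaotic oscillatory transfer function. The oscillator form (coupled excitatory/inhibitory states, loosely in the spirit of a Lee oscillator), its parameters, and the names `oscillatory_transfer` and `OscillatoryLSTMCell` are illustrative assumptions, not the paper's exact CNO-LSTM formulation.

```python
# Sketch only: one plausible reading of "oscillatory neurons in the memory
# cell". The oscillator dynamics and all constants below are assumptions.
import torch
import torch.nn as nn


def oscillatory_transfer(x, steps=10, a=5.0, b=5.0, decay=0.5):
    """Hypothetical oscillatory activation: iterate coupled excitatory (e)
    and inhibitory (i) states driven by the input, then blend their
    difference with a plain tanh readout."""
    e = torch.zeros_like(x)
    i = torch.zeros_like(x)
    for _ in range(steps):
        e = torch.tanh(a * (e - i) + x)  # excitatory update
        i = torch.tanh(b * (e - i) - x)  # inhibitory update
    return decay * torch.tanh(x) + (1 - decay) * (e - i)


class OscillatoryLSTMCell(nn.Module):
    """Standard LSTM cell with the candidate-state and cell-readout tanh
    activations swapped for the oscillatory transfer above."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def forward(self, x, state):
        h, c = state
        z = self.gates(torch.cat([x, h], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = oscillatory_transfer(g)        # oscillatory candidate state
        c = f * c + i * g                  # memory-cell update
        h = o * oscillatory_transfer(c)    # oscillatory cell readout
        return h, (h, c)


# Usage: one step over a batch of 32 input embeddings of size 128.
cell = OscillatoryLSTMCell(input_size=128, hidden_size=64)
x = torch.randn(32, 128)
h0 = c0 = torch.zeros(32, 64)
h, (h1, c1) = cell(x, (h0, c0))
print(h.shape)  # torch.Size([32, 64])
```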
