Abstract

Recurrent neural networks (RNNs) have shown outstanding performance on natural language processing tasks. However, because training an RNN involves repeated multiplication by the recurrent weight matrix, gradients tend to vanish or explode. The independently recurrent neural network (IndRNN) addresses these gradient problems by making the neurons within a layer independent of one another and constraining the recurrent weights. We combine IndRNN with long short-term memory (LSTM) and an attention mechanism and evaluate the resulting models on text classification; the results show that our models adapt effectively to the text classification task.
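As a brief illustration of the mechanism described above, the sketch below implements a single IndRNN step; the variable names, dimensions, and the ReLU activation are our assumptions for illustration, not details taken from the paper. Each neuron keeps one scalar recurrent weight, and clipping that weight bounds the repeated per-neuron multiplication that otherwise causes gradients to vanish or explode.

```python
import numpy as np

def indrnn_step(x_t, h_prev, W, u, b, u_max=1.0):
    """One IndRNN step: h_t = relu(W x_t + u * h_{t-1} + b).

    Unlike a standard RNN, the recurrence is element-wise (u * h_prev),
    so each neuron depends only on its own previous state.
    """
    u = np.clip(u, -u_max, u_max)  # constrain recurrent weights to tame gradients
    return np.maximum(0.0, x_t @ W + u * h_prev + b)  # ReLU activation

# Hypothetical usage: 8-dimensional inputs, 16 independent recurrent neurons.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16)) * 0.1
u = rng.uniform(-1.0, 1.0, size=16)
b = np.zeros(16)
h = np.zeros(16)
for x_t in rng.normal(size=(5, 8)):  # a length-5 input sequence
    h = indrnn_step(x_t, h, W, u, b)
```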
