Abstract

In deep learning, recurrent neural networks are usually better suited to time-series-sensitive problems and tasks such as natural language processing and speech recognition. Long short-term memory (LSTM) is a representative recurrent architecture: it models temporal dependence and can build a global representation of features. However, the large number of network parameters in LSTMs limits the applicability of their solutions. This paper proposes an improved hybrid structure combining a graph convolutional neural network and a recurrent neural network. In the input layer, a two-dimensional convolutional neural network is used to build a text corpus graph, and graph embedding preserves the global structure of the entire text graph. An LSTM layer and an attention mechanism then perform the text classification and improve computational efficiency. Test results show that the hybrid network structure runs faster on the IMDb dataset.
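To make the described pipeline concrete, the following is a minimal sketch of a hybrid graph-convolution + LSTM + attention classifier in PyTorch. The layer sizes, the single graph-convolution step, the additive attention pooling, and the stand-in normalized adjacency are all illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridGCNLSTM(nn.Module):
    """Sketch: graph convolution over a text graph, LSTM over node order,
    attention pooling, then a linear classifier (assumed dimensions)."""

    def __init__(self, feat_dim, gcn_dim, lstm_dim, num_classes):
        super().__init__()
        # Graph convolution weight: propagates features over the text graph.
        self.gcn_weight = nn.Linear(feat_dim, gcn_dim)
        # LSTM consumes the node embeddings as an ordered sequence.
        self.lstm = nn.LSTM(gcn_dim, lstm_dim, batch_first=True)
        # Additive attention scores each time step for pooling.
        self.attn = nn.Linear(lstm_dim, 1)
        self.classifier = nn.Linear(lstm_dim, num_classes)

    def forward(self, x, adj):
        # x:   (batch, num_nodes, feat_dim) node features
        # adj: (batch, num_nodes, num_nodes) normalized adjacency A_hat
        # One GCN layer: H = ReLU(A_hat @ X @ W)
        h = F.relu(torch.bmm(adj, self.gcn_weight(x)))
        # LSTM over the node sequence captures temporal dependence.
        out, _ = self.lstm(h)                          # (batch, N, lstm_dim)
        # Attention pooling: weight each step, then sum.
        scores = torch.softmax(self.attn(out), dim=1)  # (batch, N, 1)
        pooled = (scores * out).sum(dim=1)             # (batch, lstm_dim)
        return self.classifier(pooled)


# Smoke test with random tensors standing in for an IMDb-style batch.
model = HybridGCNLSTM(feat_dim=300, gcn_dim=128, lstm_dim=64, num_classes=2)
x = torch.randn(4, 50, 300)                            # 4 docs, 50 nodes each
adj = torch.softmax(torch.randn(4, 50, 50), dim=-1)    # stand-in A_hat
logits = model(x, adj)
print(logits.shape)  # torch.Size([4, 2])
```

In this sketch the graph convolution supplies the global structural view of the corpus graph, while the LSTM and attention layers handle the sequential, time-dependent aspect that the abstract attributes to the recurrent component.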
