Abstract

Text representation is one of the fundamental problems in text analysis tasks. The key to text representation is extracting and expressing the semantic and syntactic features of texts. Order-sensitive sequence models based on neural networks have made great progress in text representation. Bidirectional Long Short-Term Memory (BiLSTM) networks, an extension of Recurrent Neural Networks (RNNs), can not only handle variable-length texts and capture their long-term dependencies, but also model both forward and backward sequence contexts. Moreover, Convolutional Neural Networks (CNNs) can extract additional semantic and structural information from texts through their convolution and pooling operations. This paper proposes a hybrid model that combines a BiLSTM with 2-dimensional convolution and 1-dimensional pooling operations. That is, the model first captures an abstract representation vector of the text with the BiLSTM, and then extracts semantic features with 2-dimensional convolution and 1-dimensional pooling. Experiments on text classification tasks show that our method obtains competitive performance compared with state-of-the-art models on the MR sentence polarity dataset.
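
To make the described pipeline concrete, below is a minimal sketch of the architecture the abstract outlines: a BiLSTM encoder, a 2-dimensional convolution over the resulting (time x feature) matrix, and 1-dimensional max pooling over the time axis. The use of PyTorch, the class and parameter names, and all hyperparameter values are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn

class BiLSTM2DCNN(nn.Module):
    """Sketch of the hybrid model: BiLSTM encoding, then a 2-D
    convolution over the (time x feature) output matrix, then
    1-D max pooling over the time dimension. Hyperparameters
    are illustrative assumptions."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100,
                 num_filters=100, kernel_size=3, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # One input channel: the BiLSTM output is treated as a 2-D map.
        self.conv = nn.Conv2d(1, num_filters,
                              kernel_size=(kernel_size, kernel_size))
        # Feature width after convolving the 2*hidden_dim feature axis.
        feat_width = 2 * hidden_dim - kernel_size + 1
        self.fc = nn.Linear(num_filters * feat_width, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)                  # (batch, seq_len, 2*hidden_dim)
        feat = torch.relu(self.conv(h.unsqueeze(1)))  # (batch, filters, T', F')
        pooled = feat.max(dim=2).values        # 1-D max pool over time
        return self.fc(pooled.flatten(start_dim=1))

# Hypothetical usage on a batch of padded token-id sequences.
model = BiLSTM2DCNN(vocab_size=20000)
logits = model(torch.randint(0, 20000, (8, 40)))  # shape (8, num_classes)

Pooling only over the time axis keeps the classifier input size fixed regardless of sentence length, which is one plausible way to realize the "1-dimensional pooling" the abstract describes.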
