Abstract

Text classification is the task of assigning a given text to one or more predefined classes. It has important applications in redundancy filtering, organization and management, information retrieval, index construction, ambiguity resolution, and text filtering. This paper introduces the research background of text classification and surveys research developments at home and abroad. Text classification is an essential component of many NLP problems, and neural network models have achieved remarkable results on it. We therefore discuss common deep learning approaches to text classification, including the Convolutional Neural Network (CNN), the Recurrent Convolutional Neural Network (RCNN), Long Short-Term Memory (LSTM), and fastText. A CNN constructs a representation of the text through convolution and pooling; an RNN captures contextual information well; and an LSTM is explicitly designed for sequential data and for learning long-term dependencies. In addition, we introduce distributed representations such as Continuous Bag-of-Words (CBOW) and Skip-Gram, and analyze the advantages of the word2vec model over one-hot encoding.
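
To make the CNN approach mentioned above concrete, the sketch below shows a minimal convolutional text classifier in PyTorch. It is only an illustration of the general technique, not the paper's implementation; the class name TextCNN, the vocabulary size, the kernel sizes, and the random input batch are assumptions chosen for the example. The embedding layer also illustrates the contrast the abstract draws between dense word vectors and sparse one-hot encodings: each token is looked up as a learned 100-dimensional vector instead of a vocabulary-sized indicator vector.

```python
# Minimal sketch of a CNN text classifier (illustrative only, not the paper's model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, num_classes=2,
                 kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        # Dense word vectors: each token id maps to a learned embed_dim vector,
        # unlike a sparse vocab_size-dimensional one-hot vector.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One 1-D convolution per window size slides over the token sequence
        # and extracts local n-gram features.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                # (batch, embed_dim, seq_len)
        # Max-over-time pooling keeps the strongest response of each filter,
        # yielding a fixed-size representation of the whole text.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)
        return self.fc(features)             # class logits

# Example usage with a hypothetical batch of 4 padded sequences of length 20
# over a 5000-word vocabulary.
model = TextCNN(vocab_size=5000)
logits = model(torch.randint(0, 5000, (4, 20)))
print(logits.shape)  # torch.Size([4, 2])
```

The same embedding layer could be initialized from pre-trained word2vec (CBOW or Skip-Gram) vectors rather than learned from scratch, which is one of the practical advantages of distributed representations discussed in the paper.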
