Abstract

Text classification and generation are two important tasks in natural language processing. In this paper, we address both tasks with a Variational Autoencoder (VAE), a powerful deep generative model. A self-attention mechanism is introduced into the encoder. The modified encoder extracts a global feature of the input text to produce the hidden code, and a neural network classifier trained on the hidden code performs the classification. In addition, the label of the text is fed into the decoder explicitly to strengthen the category information, which helps with text generation. Experiments show that our model achieves competitive classification results and that the generated text is realistic. The proposed integrated deep generative model is thus a viable alternative for both tasks.

Highlights

  • Text classification is one of the most basic and important tasks in the field of natural language processing, in which one should assign predefined categories to the text

  • We compared several strong neural-network baselines, such as Recursive Autoencoders (RAE) with word vectors pretrained on Wikipedia, the Matrix-Vector Recursive Neural Network (MV-RNN) with parse trees, the Recursive Neural Tensor Network (RNTN) with a tensor-based feature function and parse trees, and the Dynamic Convolutional Neural Network (DCNN) with k-max pooling

  • We have proposed an integrated deep generative model to deal with both the text classification and generation


Summary

Introduction

Text classification is one of the most basic and important tasks in natural language processing, in which predefined categories must be assigned to text. Deep learning models based on neural networks have achieved remarkable results in various tasks, such as computer vision [5] and speech recognition [6]. The Variational Autoencoder (VAE), a powerful deep generative model, has attracted the attention of many researchers in recent years [19, 20]. It consists of a probabilistic encoder and a probabilistic decoder and takes advantage of variational inference. We propose an integrated model based on the VAE to handle both the generation and classification tasks.
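The two VAE building blocks mentioned above can be illustrated with a minimal NumPy sketch: the reparameterization trick that makes sampling from the probabilistic encoder differentiable, and the closed-form KL term used in variational inference for a diagonal Gaussian posterior. The shapes, latent dimension, and the one-hot label concatenation are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    # z = mu + sigma * eps with eps ~ N(0, I); sampling stays differentiable
    # with respect to the encoder outputs mu and log_var
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    # KL( N(mu, diag(exp(log_var))) || N(0, I) ) in closed form;
    # this is the regularization term of the VAE objective
    return -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var), axis=-1)

# toy hidden codes for a batch of 4 texts with an assumed latent dim of 8
mu = rng.standard_normal((4, 8)) * 0.1
log_var = rng.standard_normal((4, 8)) * 0.1
z = reparameterize(mu, log_var)

# one common way to feed the label into the decoder explicitly is to
# concatenate a one-hot label vector to the sampled code z (3 classes assumed)
labels = np.array([0, 2, 1, 0])
decoder_input = np.concatenate([z, np.eye(3)[labels]], axis=-1)
```

The classifier in the integrated model would then be trained on the code `z` (or on `mu`), while the decoder receives `decoder_input` so that generation is conditioned on the category.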

Review of VAE
Recurrent Neural Network
Details of the Model
Related Works
Experiments
Methods
Conclusion
