Abstract

Methods that combine word-level and character-level features can effectively boost performance on Chinese short text classification. However, many existing works concatenate the two levels of features with little further processing, which loses feature information. In this work, we propose a novel framework, Mutual-Attention Convolutional Neural Networks, which integrates word-level and character-level features without losing much feature information. We first generate two matrices carrying aligned information from the two levels of features by multiplying the word and character features with a trainable matrix. We then stack these matrices as a three-dimensional tensor. Finally, we generate the integrated features with a convolutional neural network. Extensive experiments on six public datasets demonstrate the improved performance of our framework over current methods.
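The alignment-and-stacking pipeline described above can be sketched in NumPy under one plausible reading of the abstract. All dimensions, the bilinear attention form `W M Cᵀ`, and the choice to stack the word features with their character-aligned counterparts are assumptions for illustration, not the paper's exact formulation; in the actual model the interaction matrix `M` would be learned, not random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): sequence lengths and embedding size.
n_words, n_chars, d = 6, 10, 8

W = rng.standard_normal((n_words, d))   # word-level features
C = rng.standard_normal((n_chars, d))   # character-level features
M = rng.standard_normal((d, d))         # trainable interaction matrix (random stand-in)

# Mutual-attention scores between every word and every character.
A = W @ M @ C.T                          # shape (n_words, n_chars)

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Re-express each word as an attention-weighted sum of character features,
# yielding a character-aligned matrix with the same shape as W.
W_aligned = softmax(A, axis=1) @ C       # shape (n_words, d)

# Stack the original word features with the aligned features as a
# two-channel three-dimensional tensor, ready for a 2-D convolution.
T = np.stack([W, W_aligned], axis=0)     # shape (2, n_words, d)
print(T.shape)  # (2, 6, 8)
```

A convolutional layer would then slide kernels over this two-channel tensor to produce the integrated features, so interactions between the two levels are learned rather than lost to plain concatenation.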
