Abstract

Relation classification is an important ingredient task in the construction of knowledge graphs, question answering systems, and numerous other natural language processing (NLP) tasks. With the application of deep neural networks (DNNs) such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), the relation classification task has achieved satisfactory results. However, many proposed models cannot take full advantage of multiple window sizes for CNN filters, which ultimately hurts performance on this task. Moreover, unlike public general-purpose datasets that contain large quantities of instances drawn from natural language or daily conversation, many deep neural networks of high complexity do not perform well on small corpora from specific fields. To address these problems, we propose a novel CNN model with an attention mechanism over multi-window-sized kernels to capture the most important information, and we test our system not only on the general SemEval 2010 dataset but also on a small dataset built manually from a Chinese textbook on the fundamentals of electric circuits. The experimental results show that our system outperforms the baseline systems on the SemEval 2010 relation classification task and validate the effectiveness of CNNs for relation classification on the small, domain-specific Chinese corpus.
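The core idea described above can be sketched roughly as follows: convolve the sentence embeddings with filter banks of several window sizes, max-pool each bank over time, and then let an attention mechanism weight the per-window-size feature vectors before fusing them into one sentence representation. This is a minimal NumPy sketch, not the authors' implementation; all shapes, the ReLU/max-pooling choices, and the attention query vector `v` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_relu_maxpool(X, W):
    """Convolve embeddings X (seq_len, d) with a filter bank W (win, d, n_filters),
    apply ReLU, and max-pool over time. Returns a (n_filters,) feature vector."""
    win, d, nf = W.shape
    L = X.shape[0] - win + 1
    feats = np.empty((L, nf))
    for i in range(L):
        # Flatten the window and project through all filters at once.
        feats[i] = np.maximum(X[i:i + win].reshape(-1) @ W.reshape(-1, nf), 0.0)
    return feats.max(axis=0)  # max-over-time pooling

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Toy sentence: 12 tokens with 8-dimensional embeddings (hypothetical sizes).
seq_len, d, nf = 12, 8, 16
X = rng.normal(size=(seq_len, d))

# One filter bank per window size, e.g. bigram/trigram/4-gram kernels.
windows = [2, 3, 4]
filters = [rng.normal(size=(w, d, nf)) * 0.1 for w in windows]

# One pooled feature vector per window size: shape (len(windows), nf).
pooled = np.stack([conv_relu_maxpool(X, W) for W in filters])

# Attention over window sizes: score each pooled vector with a learned
# query vector v (random here), softmax, then take the weighted sum.
v = rng.normal(size=nf)
alpha = softmax(pooled @ v)      # one weight per window size, sums to 1
sent_repr = alpha @ pooled       # fused sentence representation, shape (nf,)
```

In a trained model, the filter banks and the attention query would be learned jointly with a softmax classifier over relation labels; the attention weights let the model emphasize whichever window size captures the most discriminative n-gram features for a given sentence.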
