Abstract

The structure of a sentence conveys rich linguistic knowledge and has proven useful for natural language understanding. In this paper, we incorporate syntactic constraints and long-range word dependencies into the sentence encoding procedure using the widely applied Graph Convolutional Network (GCN) together with word dependency trees. Existing syntax-aware GCN methods construct the adjacency matrix solely from whether two words are connected in the dependency tree; they fail to model the dependency type, which reflects how the words are linked, and therefore cannot distinguish the different contributions of different word dependency paths. To avoid introducing redundant word dependencies that harm language understanding, we propose a GCN variant extended with a novel Word Dependency Gate mechanism. The Word Dependency Gate adaptively balances the inclusion and exclusion of specific word dependency paths based on the dependency type and its word context. Experiments show that our approach effectively incorporates the relevant syntactic dependencies into BERT and achieves state-of-the-art performance on the End-to-End Aspect-Based Sentiment Analysis and Relation Triple Extraction tasks.
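
To make the gating idea concrete, below is a minimal sketch of one GCN layer whose messages along dependency edges are scaled by a gate computed from the dependency type and the word context of the two endpoints. The abstract does not give the exact formulation, so the tensor shapes, the gate parameterization, and all names here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a dependency-gated GCN layer (PyTorch).
# The gate parameterization is an assumption; the paper's exact
# Word Dependency Gate formulation is not given in the abstract.
import torch
import torch.nn as nn

class GatedDependencyGCNLayer(nn.Module):
    """One GCN layer where each message along a dependency edge is
    modulated by a gate conditioned on the dependency type and the
    contextual states of the head and dependent words."""

    def __init__(self, hidden_dim: int, num_dep_types: int):
        super().__init__()
        self.dep_type_emb = nn.Embedding(num_dep_types, hidden_dim)
        # Gate input: [head word state; dependent word state; dependency-type embedding].
        self.gate = nn.Linear(3 * hidden_dim, hidden_dim)
        self.message = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h, edge_index, edge_type):
        # h:          (num_words, hidden_dim) contextual word states (e.g. from BERT)
        # edge_index: (2, num_edges) head/dependent word indices from the dependency tree
        # edge_type:  (num_edges,)   dependency-relation ids (nsubj, dobj, ...)
        src, dst = edge_index
        e = self.dep_type_emb(edge_type)
        # Element-wise gate in [0, 1]: values near 0 exclude a dependency
        # path, values near 1 keep it, based on type and word context.
        g = torch.sigmoid(self.gate(torch.cat([h[src], h[dst], e], dim=-1)))
        msg = g * self.message(h[src])                     # gated messages
        out = torch.zeros_like(h).index_add_(0, dst, msg)  # aggregate per word
        return torch.relu(out + h)                         # residual update
```

A per-edge sigmoid gate lets the layer soft-prune a redundant dependency path (gate near zero) rather than the hard keep/drop decision that a binary adjacency matrix imposes, which matches the balance between inclusion and exclusion described above.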
