In current relation extraction tasks, when the input sentence structure is complex, in-context learning methods based on large language models still underperform traditional pre-train/fine-tune models. For complex sentence structures, dependency syntax can provide effective prior information about text structure for relation extraction. However, most studies are affected by noise in the syntactic information automatically extracted by natural language processing toolkits. In addition, traditional pre-trained encoders suffer from issues such as overly concentrated word embeddings for high-frequency words, which hinders the model's ability to learn contextual semantic information. To address these problems, this paper proposes a hyperbolic graph convolutional network relation extraction model that combines dependency syntax and contrastive learning. Building on a hyperbolic graph neural network, the model introduces dependency syntax information and an information optimization strategy to alleviate the concentration of word embeddings. Meanwhile, to mitigate the impact of noise in dependency syntax on the relation extraction task, a contrastive learning approach is employed: after the model learns contextual semantics separately from the original dependency syntax and from dependency syntax with added random noise, it maximizes the mutual information between entity-word representations, helping the model distinguish noise in the dependency syntax. Experiments show that the proposed model effectively improves relation extraction performance on public datasets, achieving markedly higher precision than in-context learning on datasets with complex sentence structures.
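The abstract does not give the exact contrastive objective, but maximizing mutual information between two views of the same entity representations is commonly implemented as an InfoNCE-style loss. The sketch below is illustrative only: the function name, batch layout, and temperature value are assumptions, with matching rows of the clean-graph and noised-graph entity embeddings treated as positive pairs and all other rows as negatives.

```python
import numpy as np

def info_nce(z_clean, z_noisy, temperature=0.1):
    """InfoNCE lower bound on the mutual information between two views.

    z_clean, z_noisy: (batch, dim) entity representations produced from the
    original dependency graph and the noise-augmented graph, respectively.
    Row i of each matrix is assumed to describe the same entity (positive
    pair); every other row serves as a negative.
    """
    # L2-normalize so the dot product becomes cosine similarity.
    z1 = z_clean / np.linalg.norm(z_clean, axis=1, keepdims=True)
    z2 = z_noisy / np.linalg.norm(z_noisy, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature             # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs lie on the diagonal; minimizing this loss maximizes a
    # lower bound on the mutual information between the two views.
    return -np.mean(np.diag(log_probs))
```

Minimizing this loss pulls the clean and noised representations of the same entity together while pushing apart representations of different entities, which is one standard way to make the encoder insensitive to the injected syntactic noise.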