Abstract

Metaphors are ubiquitous in natural language, and detecting them requires contextual reasoning about whether a semantic incongruence actually exists. Most existing work addresses this problem with pre-trained contextualized models. Despite their success, these models require large amounts of labeled data and are not linguistically grounded. In this paper, we propose a ContrAstive pre-Trained modEl (CATE) for metaphor detection with semi-supervised learning. Our model first uses a pre-trained model to obtain contextual representations of target words, then employs a contrastive objective, grounded in linguistic theory, to increase the distance between target words’ literal and metaphorical senses. Furthermore, we propose a simple strategy to collect large-scale candidate instances from a general corpus and generalize the model via self-training. Extensive experiments show that CATE outperforms state-of-the-art baselines on several benchmark datasets.
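As a rough illustration of the contrastive objective described above (a minimal sketch, not the authors' implementation), an InfoNCE-style loss can pull a target word's representation toward examples of the same sense and push it away from examples of the other sense. All function names, the temperature value, and the toy vectors below are hypothetical:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def contrastive_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss: low when the anchor is close to the positive
    (same sense, e.g. both metaphorical) and far from the negatives
    (the other sense, e.g. literal uses of the same target word)."""
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))
```

In practice the vectors would be contextual representations of the target word produced by the pre-trained encoder; minimizing this loss increases the separation between literal and metaphorical senses in embedding space.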

Highlights

  • Recent studies apply deep learning techniques for metaphor detection (MD) (Wu et al., 2018; Gao et al., 2018; Mao et al., 2019; Rohanian et al., 2020; Le et al., 2020)

  • To address the label scarcity issue, we propose a simple target-based generating strategy to automatically generate training data, inspired by the distantly supervised paradigm (Mintz et al., 2009; Hoffmann et al., 2011)

  • We find that ContrAstive Pre-Trained ModEl (CATE) achieves strong performance on all datasets, outperforming existing models on 3 out of 4 datasets in terms of F1-score and matching MelBERT on VUA VERB
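
The target-based generating strategy in the highlights above can be sketched roughly as follows. This is an assumed illustration, not the paper's pipeline: the seed target list and function name are hypothetical, and in self-training the collected candidates would then be pseudo-labeled by the current model:

```python
# Hypothetical seed list of metaphor-prone target words.
TARGET_WORDS = {"digest", "digested", "absorb", "attack"}

def collect_candidates(corpus_sentences, targets=TARGET_WORDS):
    """Scan raw corpus sentences and keep (sentence, target, position)
    triples wherever a seed target word occurs. These unlabeled
    candidates feed the self-training loop."""
    candidates = []
    for sent in corpus_sentences:
        tokens = sent.lower().split()
        for i, tok in enumerate(tokens):
            word = tok.strip(".,!?\"'")
            if word in targets:
                candidates.append((sent, word, i))
    return candidates
```

Because the strategy only requires matching target words against raw text, it scales to large general corpora without manual annotation, in the spirit of distant supervision.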


Summary

Introduction

Recent studies apply deep learning techniques for metaphor detection (MD) (Wu et al., 2018; Gao et al., 2018; Mao et al., 2019; Rohanian et al., 2020; Le et al., 2020). These methods directly embed textual semantic information into a low-dimensional space with deep neural networks. For example, in the sentence “I have digested all this information,” the word digested does not literally mean converting food into absorbable substances; instead, it means “arrange and integrate in the mind” in this context. This metaphor conceptualizes the concept of ideas in terms of the properties of food.

