Abstract

State-of-the-art semantic role labeling (SRL) performance has been achieved with neural network models that incorporate syntactic features such as dependency trees. In recent years, end-to-end neural network models have produced state-of-the-art SRL performance even without syntactic features. Another breakthrough came with the advent of the bidirectional encoder representations from transformers (BERT) language model. Although the semantic information of each word in a sentence is important for determining the word's meaning, previous end-to-end neural network studies did not utilize such semantic information. In this study, we propose a BERT-based SRL model that uses simple semantic information without syntactic features. To obtain this semantic information, we used PropBank, which describes the relational information between predicates and arguments. In addition, we utilized text-originated feature information obtained from the training text data. Our proposed model achieved state-of-the-art results on both the Korean PropBank and CoNLL-2009 English benchmarks.
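The following is a minimal sketch, not the authors' released code, of how a BERT-based SRL tagger might concatenate a per-token semantic-feature embedding (e.g., a PropBank predicate/roleset indicator) with the BERT token representations before a role classifier. All names such as semantic_feat_ids and the feature dimension are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
from transformers import BertModel

class BertSrlWithSemanticFeatures(nn.Module):
    def __init__(self, num_roles, num_semantic_feats,
                 bert_name="bert-base-cased", feat_dim=64):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        # Embedding for hypothetical token-level semantic feature ids
        # (e.g., PropBank predicate-sense / roleset indicators).
        self.feat_emb = nn.Embedding(num_semantic_feats, feat_dim)
        hidden = self.bert.config.hidden_size
        # Per-token classifier over argument-role labels (e.g., BIO tags).
        self.classifier = nn.Linear(hidden + feat_dim, num_roles)

    def forward(self, input_ids, attention_mask, semantic_feat_ids):
        # Contextual token representations from BERT.
        token_repr = self.bert(input_ids=input_ids,
                               attention_mask=attention_mask).last_hidden_state
        # Concatenate the semantic-feature embedding with each token vector,
        # then predict a role label for every token.
        feats = self.feat_emb(semantic_feat_ids)
        combined = torch.cat([token_repr, feats], dim=-1)
        return self.classifier(combined)

In such a setup, semantic_feat_ids would be an integer tensor aligned with input_ids; the concatenation point and feature vocabulary are design choices assumed here for illustration only.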
