Abstract

We present a neural exhaustive approach that addresses both named entity recognition (NER) and relation extraction (RE) for the entity and relation recognition over wet-lab protocols shared task. We introduce a BERT-based neural exhaustive approach that enumerates all possible spans as potential entity mentions and classifies them into entity types or no entity with deep neural networks to address NER. To solve the relation extraction task, we create all possible trigger-argument pairs, based on the NER predictions or given gold mentions, and classify them into relation types or no relation. In the NER task, we achieved an F-score of 76.60%, ranking third among the participating systems. In the relation extraction task, we achieved an F-score of 80.46%, the top result among the participating systems. In addition, we compare our model on the wet lab protocols corpus (WLPC) with the WLPC baseline and the dynamic graph-based information extraction (DyGIE) system.

Highlights

  • The entity and relation recognition over wet-lab protocols (Tabassum et al., 2020) shared task is an open challenge that allows participants to use any methodology and knowledge sources for wet lab protocols, which specify the steps in performing a lab procedure

  • Our model outperforms the wet lab protocols corpus (WLPC) baseline by 4.81% for named entity recognition (NER) and by 7.79% for relation extraction (RE), and the dynamic graph-based information extraction (DyGIE) system by 3.61% for NER

  • To evaluate NER performance, we conduct experiments on different BERT-based learning representations: PubmedBERT trained on the merged training and dev sets (PubmedBERT-Merge), PubmedBERT trained on the training set only (PubmedBERT-Train), SciBERT trained on the merged training and dev sets (SciBERT-Merge), and SciBERT trained on the training set only (SciBERT-Train)

Summary

Introduction

The entity and relation recognition over wet-lab protocols (Tabassum et al., 2020) shared task is an open challenge that allows participants to use any methodology and knowledge sources for wet lab protocols, which specify the steps in performing a lab procedure. The task comprises two sub-tasks in the wet lab protocols domain: named entity recognition (NER) and relation recognition or extraction (RE). In NER, the task is to detect mentions and classify them into entity types or no entity. RE is the task of identifying relation types between known or predicted entity mentions in a sentence. We present a BERT-based neural exhaustive approach that addresses both the NER and RE tasks. We employ a neural exhaustive model (Sohrab and Miwa, 2018; Sohrab et al., 2019b) for NER and extend it to address the RE task. We compare our model with state-of-the-art models over the wet lab protocols corpus (WLPC), including the WLPC baseline, which addresses the NER and RE tasks with LSTM-CRF and maximum-entropy-based approaches, respectively. Our model outperforms the WLPC baseline by 4.81% for NER and 7.79% for RE, and the DyGIE system by 3.61% for NER.
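The core of the exhaustive approach is to enumerate every candidate span up to a maximum length for NER, and every ordered trigger-argument pair of predicted (or gold) mentions for RE. A minimal sketch of this enumeration step, with illustrative function names and a hypothetical maximum span length (not the authors' actual code or hyperparameters):

```python
def enumerate_spans(tokens, max_span_len=3):
    """Enumerate all candidate spans of up to max_span_len tokens.

    Each span is a half-open (start, end) index pair; in the full model,
    every span representation is classified into an entity type or "no entity".
    """
    spans = []
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_span_len, len(tokens)) + 1):
            spans.append((start, end))
    return spans


def enumerate_pairs(mentions):
    """Pair every mention with every other mention as trigger-argument
    candidates; each pair is classified into a relation type or "no relation"."""
    return [(a, b) for a in mentions for b in mentions if a != b]


tokens = ["Add", "50", "ml", "of", "buffer"]
candidate_spans = enumerate_spans(tokens)          # 12 candidate spans
candidate_pairs = enumerate_pairs(candidate_spans[:3])  # 6 ordered pairs
```

Limiting spans to a maximum length keeps the number of candidates linear in sentence length rather than quadratic, which is what makes the exhaustive enumeration tractable.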

Related Work
Neural Exhaustive Approach for NER and Relation Extraction
BERT Layer
Entity Recognition Layer
Relation Recognition Layer
Experimental Settings
Data Preprocessing
Training Settings
Results and Discussions
NER Performances
Relation Extraction Performances
Ablation Study
Conclusion