Abstract

Pretrained language models have achieved widespread success on various natural language processing tasks. In the biomedical domain, one line of research is to utilize large-scale in-domain corpora for pre-training. While these models achieve remarkable improvements on in-domain tasks, they do not take into account the positive role of large-scale in-domain knowledge bases. Integrating biomedical knowledge from knowledge bases such as the Unified Medical Language System (UMLS) into these models can further benefit in-domain downstream tasks, such as biomedical named entity recognition and relation extraction. To this end, we propose BioELM, a pre-trained language model based on entity linking that explicitly leverages knowledge from the UMLS knowledge base. We utilize a two-layer entity-linking structure to integrate entity representations. To optimize the pre-training process, we improve the masked language modeling objective and add two training objectives, namely named entity recognition and entity linking. We validate the performance of BioELM on named entity recognition and relation extraction tasks from the BLURB benchmark. The experimental results demonstrate that the pre-training tasks and entity-linking strategy of BioELM improve performance on both biomedical named entity recognition and relation extraction.
