Abstract

Pretrained language models have shown success in many natural language processing tasks, and many works explore incorporating knowledge into them. In the biomedical domain, experts have spent decades building large-scale knowledge bases. For example, the Unified Medical Language System (UMLS) contains millions of entities with their synonyms and defines hundreds of relations among entities. Leveraging this knowledge can benefit a variety of downstream tasks such as named entity recognition and relation extraction. To this end, we propose KeBioLM, a biomedical pretrained language model that explicitly leverages knowledge from the UMLS knowledge base. Specifically, we extract entities from PubMed abstracts and link them to UMLS. We then train a knowledge-aware language model that first applies a text-only encoding layer to learn entity representations and then applies a text-entity fusion encoding layer to aggregate entity representations. In addition, we add two training objectives: entity detection and entity linking. Experiments on named entity recognition and relation extraction tasks from the BLURB benchmark demonstrate the effectiveness of our approach. Further analysis on a collected probing dataset shows that our model better captures medical knowledge.
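The two-stage encoding and the two extra objectives described above can be pictured roughly as follows. This is a minimal PyTorch sketch under assumptions of our own (module names, layer counts, a B/I/O detection head, dot-product linking scores over the full entity vocabulary); it is illustrative, not the authors' implementation.

```python
# Minimal sketch of a KeBioLM-style two-stage encoder with entity detection and
# entity linking heads. All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class KnowledgeAwareLM(nn.Module):
    def __init__(self, vocab_size, num_entities, hidden=768,
                 text_layers=8, fusion_layers=4, heads=12):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, hidden)
        self.ent_emb = nn.Embedding(num_entities, hidden)
        layer = lambda: nn.TransformerEncoderLayer(hidden, heads, batch_first=True)
        # Stage 1: text-only encoding layer to build mention (entity span) representations.
        self.text_encoder = nn.TransformerEncoder(layer(), num_layers=text_layers)
        # Stage 2: text-entity fusion encoding over entity-enhanced token states.
        self.fusion_encoder = nn.TransformerEncoder(layer(), num_layers=fusion_layers)
        # Extra training objectives: entity detection (B/I/O tagging) and entity linking.
        self.detect_head = nn.Linear(hidden, 3)      # B / I / O logits per token
        self.link_head = nn.Linear(hidden, hidden)   # project tokens into entity space

    def forward(self, token_ids, entity_ids, span_mask):
        # token_ids, entity_ids, span_mask: (batch, seq_len); entity_ids hold the
        # UMLS entity linked to each in-mention token (0 outside mentions).
        h = self.text_encoder(self.tok_emb(token_ids))                # (B, T, H)
        # Add the linked entity embedding back onto tokens inside each mention.
        h = h + span_mask.unsqueeze(-1) * self.ent_emb(entity_ids)
        h = self.fusion_encoder(h)
        detect_logits = self.detect_head(h)                           # entity detection
        # Entity linking: score tokens against all entity embeddings
        # (in practice one would score only a candidate subset).
        link_logits = self.link_head(h) @ self.ent_emb.weight.T       # (B, T, num_entities)
        return h, detect_logits, link_logits
```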

Highlights

  • Large-scale pretrained language models (PLMs) have proved effective in many natural language processing tasks; the Unified Medical Language System (UMLS) (Bodenreider, 2004) contains more than 4M entities with their synonyms and defines over 900 kinds of relations

  • We propose to combine the above two strategies for a better Knowledge enhanced Biomedical pretrained Language Model (KeBioLM)

  • We propose KeBioLM, a biomedical pretrained language model that explicitly incorporates knowledge from the Unified Medical Language System (UMLS)

Summary

Biomedical PLMs

Models like ELMo (Peters et al., 2018) and BERT (Devlin et al., 2019) show the effectiveness of the paradigm of first pretraining an LM on unlabeled text and then fine-tuning it on downstream NLP tasks. KnowBert (Peters et al., 2019) and Entities as Experts (EAE) (Févry et al., 2020) use an entity linker to perform entity disambiguation for candidate entity spans and enhance token representations using entity embeddings. Inspired by entity-enhanced PLMs, we follow the model of EAE to inject biomedical knowledge into KeBioLM. Relation triplets provide intrinsic knowledge between entity pairs: KEPLER (Wang et al., 2019) learns knowledge embeddings from relation triplets while pretraining, and K-BERT (Liu et al., 2020) converts input sentences into sentence trees using relation triplets to infuse knowledge. He et al. (2020) inject disease knowledge into existing PLMs by predicting disease names and aspects on Wikipedia passages. We propose KeBioLM to infuse various kinds of biomedical knowledge from UMLS, including but not limited to diseases. A sketch of the EAE-style entity lookup follows.
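The EAE-style mechanism referenced above (disambiguate a candidate mention against an entity-embedding memory, then enhance the mention's token representations with the chosen entity embedding) can be sketched as below. The function name and dot-product scoring are illustrative assumptions, not the published EAE or KeBioLM code.

```python
# Rough sketch of an Entities-as-Experts-style entity memory lookup.
import torch

def enhance_with_entity_memory(span_repr, entity_memory):
    """span_repr: (num_spans, hidden); entity_memory: (num_entities, hidden)."""
    # Score every candidate entity for each detected mention (dot-product disambiguation).
    scores = span_repr @ entity_memory.T          # (num_spans, num_entities)
    best = scores.argmax(dim=-1)                  # linked entity id per mention
    # Return the linked entity embeddings, to be added back onto the mention's
    # token representations before the text-entity fusion layers run.
    return entity_memory[best], best
```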

Approach
Pretraining Tasks
Data Creation
Named Entity Recognition
Ablation Test
Probing Results
Findings
Conclusions