Abstract

Background: Biomedical named entity recognition (BNER) is a crucial initial step of information extraction in the biomedical domain. The task is typically modeled as a sequence labeling problem. Various machine learning algorithms, such as Conditional Random Fields (CRFs), have been successfully used for this task. However, these state-of-the-art BNER systems largely depend on hand-crafted features.

Results: We present a recurrent neural network (RNN) framework based on word embeddings and character representation. On top of the neural network architecture, we use a CRF layer to jointly decode labels for the whole sentence. In our approach, contextual information from both directions and long-range dependencies in the sequence, which are useful for this task, can be well modeled by the bidirectional structure and the long short-term memory (LSTM) unit, respectively. Although our models use word embeddings and character embeddings as the only features, the bidirectional LSTM-RNN (BLSTM-RNN) model achieves state-of-the-art performance: 86.55% F1 on the BioCreative II gene mention (GM) corpus and 73.79% F1 on the JNLPBA 2004 corpus.

Conclusions: Our neural network architecture can be successfully used for BNER without any manual feature engineering. Experimental results show that domain-specific pre-trained word embeddings and character-level representation can improve the performance of the LSTM-RNN models. On the GM corpus, we achieve performance comparable with other systems that use complex hand-crafted features. On the JNLPBA corpus, our model achieves the best results, outperforming the previously top-performing systems. The source code of our method is freely available under the GPL at https://github.com/lvchen1989/BNER.
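
The abstract describes the model at a high level; the short PyTorch sketch below illustrates how a bidirectional LSTM tagger can combine word embeddings with a character-level representation to produce per-token label scores. This is not the authors' implementation: the choice of PyTorch, all layer sizes, and the class and variable names are illustrative assumptions, and a CRF layer (for example, from the third-party pytorch-crf package) would sit on top of the emission scores to jointly decode the label sequence, as described in the paper.

# Minimal sketch of a BiLSTM tagger with word + character-level representations.
# NOT the authors' implementation; dimensions and names are illustrative assumptions.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, word_vocab, char_vocab, num_tags,
                 word_dim=100, char_dim=25, char_hidden=25, hidden=200):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        # Character-level BiLSTM: builds one representation per word from its characters.
        self.char_lstm = nn.LSTM(char_dim, char_hidden, bidirectional=True, batch_first=True)
        # Sentence-level BiLSTM over [word embedding ; character representation].
        self.lstm = nn.LSTM(word_dim + 2 * char_hidden, hidden, bidirectional=True, batch_first=True)
        # Emission scores over the tag set; a CRF layer would consume these scores
        # instead of an independent per-token softmax.
        self.emit = nn.Linear(2 * hidden, num_tags)

    def forward(self, words, chars):
        # words: (batch, seq_len); chars: (batch, seq_len, max_word_len)
        b, s, c = chars.shape
        char_out, _ = self.char_lstm(self.char_emb(chars.view(b * s, c)))
        # Take the BiLSTM output at the last character position as the word's
        # character representation (a simplification for this sketch).
        char_repr = char_out[:, -1, :].view(b, s, -1)
        x = torch.cat([self.word_emb(words), char_repr], dim=-1)
        out, _ = self.lstm(x)
        return self.emit(out)  # (batch, seq_len, num_tags)

# Toy usage with random indices (shapes only, not real BioCreative/JNLPBA data).
model = BiLSTMTagger(word_vocab=5000, char_vocab=80, num_tags=3)
words = torch.randint(1, 5000, (2, 7))
chars = torch.randint(1, 80, (2, 7, 10))
print(model(words, chars).shape)  # torch.Size([2, 7, 3])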

Highlights

  • Biomedical named entity recognition (BNER) is a crucial initial step of information extraction in the biomedical domain

  • We evaluate our model on two BNER shared tasks: the BioCreative II gene mention (GM) task and the JNLPBA 2004 task

  • We evaluate our neural network model on two publicly available corpora, the BioCreAtIvE II GM corpus and the JNLPBA corpus, for comparison with existing BNER tools

Introduction

Biomedical named entity recognition (BNER), which recognizes important biomedical entities (e.g. genes and proteins) from text, is an essential step in biomedical information extraction. Various machine learning algorithms, such as Conditional Random Fields (CRFs), have been successfully used for this task, but these state-of-the-art BNER systems largely depend on hand-crafted features. Recurrent neural networks (RNNs) [16] and their long short-term memory (LSTM) variant [17] have been successfully used in various sequence prediction problems, such as general-domain NER [18, 19], language modeling [20, 21] and speech recognition [22].

