Abstract

Automatic extraction of high-quality biomedical entity relations from biomedical texts plays an important role in biomedical text mining. Currently, existing methods generally focus on training a single-task model for a specific task (e.g., drug-drug interaction extraction or protein-protein interaction extraction), ignoring the correlations among multiple tasks. To address this problem, we applied a neural network-based multi-task learning method to explore the correlations among multiple biomedical relation extraction tasks. In our study, we constructed a fully-shared model (FSM) and a shared-private model (SPM), and further proposed an attention-based main-auxiliary model (Att-MAM). Experimental results on five public biomedical relation extraction datasets show that multi-task learning can effectively learn the shared information among multiple tasks and achieves better performance than single-task methods.
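The abstract does not give implementation details, but the distinction between a fully-shared and a shared-private multi-task setup can be illustrated with a minimal sketch. The code below assumes PyTorch, LSTM encoders, and hypothetical dimensions and task labels; it is not the authors' actual architecture, only an illustration of the general pattern.

```python
import torch
import torch.nn as nn


class FullySharedModel(nn.Module):
    """All tasks share one encoder; only the output layers are task-specific."""

    def __init__(self, input_dim, hidden_dim, num_labels_per_task):
        super().__init__()
        self.shared_encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, n) for n in num_labels_per_task]
        )

    def forward(self, x, task_id):
        _, (h, _) = self.shared_encoder(x)          # final hidden state
        return self.heads[task_id](h[-1])


class SharedPrivateModel(nn.Module):
    """Each task keeps a private encoder; its output is concatenated
    with the shared encoder's output before classification."""

    def __init__(self, input_dim, hidden_dim, num_labels_per_task):
        super().__init__()
        num_tasks = len(num_labels_per_task)
        self.shared_encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.private_encoders = nn.ModuleList(
            [nn.LSTM(input_dim, hidden_dim, batch_first=True)
             for _ in range(num_tasks)]
        )
        self.heads = nn.ModuleList(
            [nn.Linear(2 * hidden_dim, n) for n in num_labels_per_task]
        )

    def forward(self, x, task_id):
        _, (h_shared, _) = self.shared_encoder(x)
        _, (h_private, _) = self.private_encoders[task_id](x)
        features = torch.cat([h_shared[-1], h_private[-1]], dim=-1)
        return self.heads[task_id](features)


# Hypothetical usage: two binary relation extraction tasks (e.g., DDI and PPI),
# a batch of 4 sentences, 30 tokens each, 100-dimensional word embeddings.
model = SharedPrivateModel(input_dim=100, hidden_dim=128, num_labels_per_task=[2, 2])
logits = model(torch.randn(4, 30, 100), task_id=0)
```

The attention-based main-auxiliary model (Att-MAM) described in the abstract would further weight the contribution of auxiliary tasks to the main task; how that attention is computed is not specified here, so it is omitted from this sketch.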
