Abstract

Pretrained language models (PLMs) have developed rapidly and achieve impressive performance on many open-domain downstream tasks. However, applying these pretrained models directly, without additional network architectures, to specialized-domain tasks such as multi-label disease diagnosis performs poorly. Recently, prompt learning has emerged as a new paradigm in the PLM field that is more convenient and better-performing than the traditional fine-tuning approach across domain tasks. However, prompt engineering is challenging because it requires time and expertise. In this paper, we propose a new prompt learning method named Knowledge-based Dynamic PrompT (KBDPT) to address these problems. First, we inject medical knowledge into PLMs through prompt templates, which makes disease-diagnosis results more reasonable and reliable. Compared with the fine-tuning approach, this method needs fewer trainable parameters and less training data yet achieves better performance. Second, unlike most existing pre-defined prompt methods, KBDPT dynamically generates prompts from a patient's medical information and a large-scale medical knowledge graph, providing more valuable guidance for disease diagnosis. Finally, the proposed model also ensembles multiple prompts covering all candidate diseases to introduce more knowledge and obtain differential diagnosis results. Experiments on multi-label disease diagnosis are conducted on three real-world EMR datasets. The results demonstrate that our model can be used with various pretrained models and outperforms both classical deep learning methods and fine-tuned PLMs. The source code of our proposed model has been released at: https://github.com/loxs123/KBDPT.
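
To make the dynamic-prompt and ensembling ideas above concrete, the following is a minimal Python sketch. The template wording, the knowledge-graph lookup, and the scoring function are illustrative assumptions, not the authors' released implementation (see the repository linked above for that).

    # Illustrative sketch only: the prompt template, KG interface, and scorer
    # below are assumptions, not the KBDPT authors' actual implementation.
    from typing import Callable, Dict, List

    def build_prompt(record: str, kg_facts: List[str], disease: str) -> str:
        # Compose a cloze-style prompt from the patient's record, the
        # knowledge-graph facts retrieved for one candidate disease, and a
        # [MASK] slot where the PLM predicts the verdict token.
        knowledge = " ".join(kg_facts) if kg_facts else "none"
        return (f"{record} Related knowledge: {knowledge} "
                f"Does the patient have {disease}? [MASK].")

    def ensemble_diagnose(record: str,
                          kg: Dict[str, List[str]],
                          candidates: List[str],
                          score_yes: Callable[[str], float],
                          threshold: float = 0.5) -> List[str]:
        # Build one prompt per candidate disease and keep every disease whose
        # "yes" probability at the [MASK] position exceeds the threshold;
        # multi-label diagnosis means several diseases may be returned.
        diagnoses = []
        for disease in candidates:
            prompt = build_prompt(record, kg.get(disease, []), disease)
            if score_yes(prompt) > threshold:
                diagnoses.append(disease)
        return diagnoses

    if __name__ == "__main__":
        # Toy usage with a stub scorer standing in for a masked-LM head.
        kg = {"pneumonia": ["pneumonia often presents with fever and cough"]}
        record = "Patient reports fever, productive cough, and chest pain."
        stub = lambda prompt: 0.9 if "pneumonia" in prompt else 0.1
        print(ensemble_diagnose(record, kg, ["pneumonia", "diabetes"], stub))
        # -> ['pneumonia']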
