Background and Objective: Biomedical relation extraction aims to identify the relations between entities in medical texts. The relation extraction models that currently attract the most attention mainly fine-tune pre-trained language models (PLMs) or add template-based prompt learning, which limits their ability to handle grammatical dependencies. Graph convolutional networks (GCNs) can play an important role in processing syntactic dependencies in biomedical texts.

Methods: In this work, we propose a biomedical relation extraction model that fuses GCN-enhanced prompt learning to overcome these limitations in handling syntactic dependencies and achieve good performance. Specifically, the model combines prompt learning with GCNs by integrating the syntactic dependency information encoded by the GCN into the prompt learning model and predicting the labels of the [MASK] tokens for relation extraction.

Results: Our model achieved F1 scores of 85.57%, 80.15%, 95.10%, and 84.11% on the biomedical relation extraction datasets GAD, ChemProt, PGR, and DDI, respectively, outperforming several existing baseline models.

Conclusions: In this paper, we propose enhancing prompt learning with GCNs, integrating syntactic information into the biomedical relation extraction task. Experimental results show that the proposed method achieves excellent performance on biomedical relation extraction.
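The fusion described in Methods can be illustrated with a minimal PyTorch sketch: a single GCN layer propagates hidden states along a dependency-parse adjacency matrix, the result is fused into the PLM hidden states, and a classification head scores relation labels from the [MASK] position. All class names, dimensions, and the residual fusion are illustrative assumptions, not the paper's exact implementation.

```python
# Hypothetical sketch of GCN-enhanced prompt learning for relation
# extraction; names and dimensions are assumptions for illustration.
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(D^-1 (A + I) H W)."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, hidden, adj):
        adj = adj + torch.eye(adj.size(-1))            # add self-loops
        deg = adj.sum(-1, keepdim=True).clamp(min=1)   # node degrees
        return torch.relu(self.linear((adj / deg) @ hidden))


class GCNPromptRE(nn.Module):
    """Fuse GCN-encoded syntax into the [MASK] token's representation,
    then score relation labels with a linear head (a stand-in for a
    verbalizer over the PLM vocabulary)."""
    def __init__(self, dim=32, num_labels=2):
        super().__init__()
        self.gcn = GCNLayer(dim)
        self.classifier = nn.Linear(dim, num_labels)

    def forward(self, hidden, adj, mask_pos):
        syntax = self.gcn(hidden, adj)                  # (batch, seq, dim)
        fused = hidden + syntax                         # residual fusion
        batch = torch.arange(hidden.size(0))
        mask_state = fused[batch, mask_pos]             # [MASK] hidden states
        return self.classifier(mask_state)              # relation logits


# Toy usage: one 5-token sentence with random PLM hidden states and a
# chain-shaped dependency graph; position 2 plays the [MASK] token.
hidden = torch.randn(1, 5, 32)
adj = torch.diag(torch.ones(4), 1) + torch.diag(torch.ones(4), -1)
logits = GCNPromptRE()(hidden, adj.unsqueeze(0), torch.tensor([2]))
print(logits.shape)  # torch.Size([1, 2])
```

In a full system, `hidden` would come from a PLM encoding a prompt template such as "[E1] [MASK] [E2]", and the head would map the [MASK] state to relation label words.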