Abstract

Purpose

To assess whether transfer learning with a bidirectional encoder representations from transformers (BERT) model, pretrained on a clinical corpus, can perform sentence-level anatomic classification of free-text radiology reports, even for anatomic classes with few positive examples.

Materials and Methods

This retrospective study included radiology reports of patients who underwent whole-body PET/CT imaging from December 2005 to December 2020. Each sentence in these reports (6272 sentences) was labeled by two annotators according to body part ("brain," "head & neck," "chest," "abdomen," "limbs," "spine," or "others"). The BERT-based transfer learning approach was compared with two baseline machine learning approaches: bidirectional long short-term memory (BiLSTM) and a count-based method. Area under the precision-recall curve (AUPRC) and area under the receiver operating characteristic curve (AUC) were computed for each approach, and AUCs were compared using the DeLong test.

Results

The BERT-based approach achieved a macro-averaged AUPRC of 0.88, outperforming both baselines. Its AUC was significantly higher than that of BiLSTM for all classes and that of the count-based method for the "brain," "chest," "abdomen," and "others" classes (P values < .025). AUPRC results for BERT were also superior to those of the baselines for classes with few labeled training data (brain: BERT, 0.95; BiLSTM, 0.11; count-based, 0.41; limbs: BERT, 0.74; BiLSTM, 0.28; count-based, 0.46; spine: BERT, 0.82; BiLSTM, 0.53; count-based, 0.69).

Conclusion

The BERT-based transfer learning approach outperformed the BiLSTM and count-based approaches in sentence-level anatomic classification of free-text radiology reports, even for anatomic classes with few labeled training data.

Keywords: Anatomy, Comparative Studies, Technology Assessment, Transfer Learning

Supplemental material is available for this article. © RSNA, 2023.
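The macro-averaged AUPRC used to compare the three approaches can be computed per class in a one-vs-rest fashion and then averaged with equal weight, so that rare classes (such as "brain") count as much as frequent ones. A minimal sketch using scikit-learn follows; the `macro_auprc` helper and the toy data are illustrative assumptions, not the study's code or results:

```python
import numpy as np
from sklearn.metrics import average_precision_score


def macro_auprc(y_true, y_score):
    """Macro-averaged area under the precision-recall curve.

    y_true:  (n_sentences, n_classes) binary indicator matrix
    y_score: (n_sentences, n_classes) per-class probabilities
    """
    # One-vs-rest average precision per class, then an unweighted mean,
    # so classes with few positive examples are not drowned out.
    per_class = [
        average_precision_score(y_true[:, k], y_score[:, k])
        for k in range(y_true.shape[1])
    ]
    return float(np.mean(per_class))


# Toy example with 3 hypothetical classes (not the study's data).
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 1, 1],
                   [0, 0, 1]])
y_score = np.array([[0.9, 0.2, 0.10],
                    [0.1, 0.8, 0.40],
                    [0.2, 0.7, 0.35],
                    [0.3, 0.1, 0.80]])
print(round(macro_auprc(y_true, y_score), 3))  # prints 0.944
```

`average_precision_score` with `average="macro"` on the full matrices gives the same result; the explicit per-class loop is shown here only to make the one-vs-rest averaging visible.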
