Abstract

Purpose
The diagnosis of prostate transition zone cancer (PTZC) remains a clinical challenge because of its similarity to benign prostatic hyperplasia (BPH) on MRI. Deep Convolutional Neural Networks (DCNNs) have shown high efficacy in diagnosing PTZC on medical imaging but are limited by small data sizes. A transfer learning (TL) method was combined with deep learning to overcome this challenge.

Materials and methods
A retrospective investigation was conducted on 217 patients enrolled from our hospital database (208 patients) and The Cancer Imaging Archive (nine patients). Using T2-weighted images (T2WIs) and apparent diffusion coefficient (ADC) maps, DCNN models were trained and compared between different TL databases (ImageNet vs. disease-related images) and protocols (training from scratch, fine-tuning, or transductive transfer).

Results
PTZC and BPH can be classified with a traditional DCNN. The efficacy of TL from natural images was limited but improved by transferring knowledge from disease-related images. Furthermore, transductive TL from disease-related images had efficacy comparable to the fine-tuning method. Limitations include the retrospective design and a relatively small sample size.

Conclusion
Deep TL from disease-related images is a powerful tool for an automated PTZC diagnostic system. In developing regions where only conventional MR scans are available, accurate diagnosis of PTZC can be achieved via transductive deep TL from disease-related images.
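To illustrate two of the training protocols compared above, the sketch below contrasts training from scratch with fine-tuning a pretrained backbone for the binary PTZC-vs-BPH task. It is not the authors' implementation: the framework (PyTorch/torchvision), the ResNet-18 backbone, and the two-channel T2WI + ADC input are assumptions, and the transductive protocol is not shown.

```python
# Minimal sketch of "from scratch" vs. "fine-tuning" protocols.
# Assumptions (not from the paper): PyTorch/torchvision, ResNet-18 backbone,
# two-channel T2WI + ADC input stack.
import torch.nn as nn
from torchvision import models

def build_classifier(pretrained: bool) -> nn.Module:
    """Binary PTZC-vs-BPH classifier, initialized randomly or from ImageNet weights."""
    weights = models.ResNet18_Weights.DEFAULT if pretrained else None
    model = models.resnet18(weights=weights)
    # Accept a two-channel stack (T2WI + ADC map) instead of RGB.
    model.conv1 = nn.Conv2d(2, 64, kernel_size=7, stride=2, padding=3, bias=False)
    # Two output classes: PTZC vs. BPH.
    model.fc = nn.Linear(model.fc.in_features, 2)
    return model

scratch_model  = build_classifier(pretrained=False)  # "from scratch" protocol
finetune_model = build_classifier(pretrained=True)   # fine-tuning: all layers stay trainable
```

Transferring from disease-related rather than natural images would correspond to initializing the backbone from weights trained on a related medical imaging task instead of ImageNet; that source model is not specified here.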

Highlights

  • About 25% of prostate cancers originate in the transition zone (TZ), and their diagnosis remains a clinical challenge due to the similarity on MRI to benign prostatic hyperplasia (BPH) [1]

  • Prostate transition zone cancer (PTZC) and BPH can be classified through traditional Deep Convolutional Neural Networks (DCNNs)


Summary

Introduction

About 25% of prostate cancers originate in the transition zone (TZ), and their diagnosis remains a clinical challenge due to the similarity on MRI to benign prostatic hyperplasia (BPH) [1]. Conventional transrectal ultrasound-guided biopsy faces the dilemma of both underdiagnosis and overdiagnosis because it is invasive, tumors may be small, and H&E slides have inherent limitations [2]. Several machine learning methods have been developed to classify prostate cancer from normal tissue or BPH [3]. Traditional machine learning methods are laborious because of their complex feature extraction procedures [4]. Because the selection of features may be influenced by different data sources and processing software, their generalization is limited. Deep Convolutional Neural Networks (DCNNs), in contrast, automatically extract diagnostic imaging features within fixed architectures [5,6].
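To make the contrast with hand-crafted feature pipelines concrete, the sketch below shows a small fixed-architecture CNN that learns imaging features end-to-end. The layer sizes and the two-channel (T2WI + ADC) input are illustrative assumptions, not the architecture used in this study.

```python
# Illustrative sketch only: a fixed-architecture CNN whose convolutional layers
# act as a learned feature extractor, replacing hand-engineered features.
# Layer widths and the 2-channel (T2WI + ADC) input are assumptions.
import torch
import torch.nn as nn

class SmallDCNN(nn.Module):
    def __init__(self, in_channels: int = 2, num_classes: int = 2):
        super().__init__()
        # Convolutional blocks: the learned feature extractor.
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # No hand-engineered features: raw image patches map directly to
        # class scores (e.g., cancer vs. BPH).
        return self.classifier(self.features(x).flatten(1))

# Example: a batch of 4 two-channel 128x128 patches.
logits = SmallDCNN()(torch.randn(4, 2, 128, 128))
```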

