Recent improvements in medical image analysis using deep learning-based neural networks can potentially be exploited to enhance the performance of computer-aided detection/diagnosis systems. In this study, we propose a feature space transfer model (FSTM) for learning the phenotype relationships between radiological images and pathological images. We hypothesize that high-level features extracted from the same patient can be linked across imaging modalities with different resolutions. We refer to our method as “augmented radiology” because the inference model requires only radiological images as input, while the prediction result can be linked to specific pathological phenotypes. We applied the proposed method to the pathological tumor classification (T0 vs. T2c/T3a and T0 vs. T2c vs. T3a) of prostate cancer and found that it achieved high classification accuracy (0.880 for T0 vs. T2c/T3a and 0.825 for T0 vs. T2c vs. T3a) given only the radiological images as input. We also analyzed the validity of the proposed method by visualizing the transferred features and found that it extracts diagnostically useful information embedded in radiological images. We conclude that the proposed method can substantially improve diagnostic prediction from radiological images.
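To make the idea concrete, the following is a minimal sketch of cross-modal feature-space alignment of the kind the abstract describes: two encoders embed paired radiological and pathological images of the same patient into a shared feature space, an alignment term links the paired features during training, and only the radiology branch is used at inference. The encoder architecture, the MSE alignment loss, the feature dimension, and the classifier head are illustrative assumptions, not the paper's actual FSTM specification.

```python
# Sketch of cross-modal feature alignment (all design choices here are
# assumptions for illustration, not the paper's exact FSTM architecture).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Small CNN that maps an image to a fixed-size feature vector."""
    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

rad_enc = Encoder(in_channels=1)   # radiological (e.g., MR) encoder
path_enc = Encoder(in_channels=3)  # pathological (histology) encoder
classifier = nn.Linear(128, 3)     # e.g., T0 vs. T2c vs. T3a

# Paired same-patient images (dummy tensors standing in for real data).
rad = torch.randn(4, 1, 224, 224)
path = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 3, (4,))

z_rad, z_path = rad_enc(rad), path_enc(path)
# Alignment term pulls same-patient features together across modalities;
# the classification term is supervised on the pathology features.
loss = nn.functional.mse_loss(z_rad, z_path.detach()) \
     + nn.functional.cross_entropy(classifier(z_path), labels)
loss.backward()

# Inference uses radiological images only, as in "augmented radiology".
with torch.no_grad():
    pred = classifier(rad_enc(rad)).argmax(dim=1)
```

In this sketch, detaching the pathology features in the alignment loss makes the radiology encoder chase the pathology feature space rather than the reverse; whether the actual FSTM uses one-way or symmetric alignment is not stated in the abstract.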