Cerebral microbleeds (CMBs) are small, chronic deposits of blood products in brain tissue that are closely associated with various cerebrovascular conditions depending on their anatomical location, including cognitive decline, intracerebral hemorrhage, and cerebral infarction. However, manual detection of CMBs is time-consuming and error-prone because of their sparse distribution and tiny size. Detection is further complicated by the presence of many CMB mimics, such as calcifications and pial vessels, which cause a high false-positive rate (FPR). This paper proposes a novel 3D deep learning framework that not only detects CMBs but also identifies their anatomical location in the brain (i.e., lobar, deep, and infratentorial regions). For the CMB detection task, we propose a single end-to-end model that leverages a 3D U-Net backbone with a Region Proposal Network (RPN). To significantly reduce false positives within this single model, we develop a new scheme comprising a Feature Fusion Module (FFM), which detects small candidates using contextual information, and Hard Sample Prototype Learning (HSPL), which mines CMB mimics and produces an additional loss term, called concentration loss, based on Convolutional Prototype Learning (CPL). For the anatomical localization task, we exploit a 3D U-Net segmentation network to segment anatomical structures of the brain. This task not only identifies the region to which each CMB belongs but also eliminates some false positives from the detection task by leveraging anatomical information. We use Susceptibility-Weighted Imaging (SWI) and phase images as 3D inputs to efficiently capture volumetric information. The results show that the proposed RPN with FFM and HSPL outperforms the baseline RPN, achieving a sensitivity of 94.66% vs. 93.33% and an average number of false positives per subject (FPavg) of 0.86 vs. 14.73.
Furthermore, the anatomical localization task enhances detection performance, reducing the FPavg to 0.56 while maintaining a sensitivity of 94.66%.
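The abstract does not give the exact form of the concentration loss. Assuming it follows the standard Convolutional Prototype Learning formulation (the mean squared distance between each sample's feature vector and the prototype of its class), a minimal NumPy sketch might look like the following; all names and the toy data are illustrative, not taken from the paper:

```python
import numpy as np

def concentration_loss(features, labels, prototypes):
    """Mean squared Euclidean distance between each feature vector
    and the prototype of its class (a hedged sketch of the CPL-style
    concentration loss; not the authors' exact implementation)."""
    # features: (N, D) feature vectors extracted by the detector
    # labels:   (N,)   class index per sample (e.g., 0 = CMB, 1 = mimic)
    # prototypes: (K, D) learnable class prototypes
    diffs = features - prototypes[labels]            # (N, D)
    return float(np.mean(np.sum(diffs ** 2, axis=1)))

# Toy example with two classes and 2-D features.
prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
features = np.array([[0.1, -0.1], [0.9, 1.2]])
labels = np.array([0, 1])
loss = concentration_loss(features, labels, prototypes)  # 0.035
```

Pulling features toward their class prototype tightens each cluster, so hard mimics that sit near the CMB prototype contribute a large penalty and push the decision boundary away from them.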