Abstract
Automatic segmentation of brain Magnetic Resonance Imaging (MRI) plays a crucial role in many brain MRI processing pipelines and supports the prevention, detection, monitoring, and treatment planning of brain disease. Deep learning algorithms currently show outstanding performance in brain segmentation, and most train models on fully annotated brain MRI datasets. However, full annotation of 3D brain MRI is laborious and time-consuming. Training models on sparsely annotated datasets is one way to reduce annotation cost, yet dense 3D segmentation from sparse annotation has not been fully studied. In this paper, we develop a segmentation framework combined with a quality-driven active learning (QDAL) module for suggestive annotation. In the proposed active learning module, an attention mechanism and deep supervision are used to improve segmentation accuracy and to feed back segmentation quality information. Meanwhile, we observe a high correlation between the two proposed surrogate metrics and the real per-slice segmentation accuracy within a scan. We validate our framework on two public brain MRI datasets for brain region extraction and brain tissue segmentation. Comparative experiments demonstrate that the QDAL method outperforms four other popular sampling strategies. Guided by QDAL, the segmentation network needs only 15–20% of annotated slices for the brain extraction task and 30–40% for the tissue segmentation task to achieve results competitive with training under full supervision.
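To make the suggestive-annotation idea concrete, the sketch below illustrates one possible slice-selection step for quality-driven active learning. It is not the authors' implementation: the single `quality_scores` input is a hypothetical stand-in for the paper's two surrogate quality metrics, and the budget fractions simply mirror the 15–20% / 30–40% figures reported in the abstract.

```python
import numpy as np


def select_slices_for_annotation(quality_scores, budget_fraction=0.2):
    """Pick the slices with the lowest predicted segmentation quality.

    quality_scores : 1-D array with one surrogate quality value per slice
                     (hypothetical proxy for the paper's surrogate metrics).
    budget_fraction: fraction of slices sent to the annotator, e.g. 0.15-0.20
                     for brain extraction, 0.30-0.40 for tissue segmentation.
    """
    scores = np.asarray(quality_scores, dtype=float)
    n_select = max(1, int(round(budget_fraction * scores.size)))
    # Lowest surrogate quality = highest expected benefit from manual annotation.
    return np.argsort(scores)[:n_select]


# Usage example: 100 slices in one scan, request labels for the 20 lowest-quality ones.
rng = np.random.default_rng(0)
picked = select_slices_for_annotation(rng.uniform(size=100), budget_fraction=0.2)
print(sorted(picked.tolist()))
```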