Abstract
The rapid spread of coronavirus disease 2019 (COVID-19) has severely threatened global health. Deep-learning-based computer-aided screening, e.g., segmentation of COVID-19 infected areas from computed tomography (CT) images, has attracted much attention as an adjunct that increases the accuracy of COVID-19 screening and clinical diagnosis. Although lesion segmentation is a hot topic, traditional deep learning methods are usually data-hungry, with millions of parameters, making them prone to overfitting on the limited available COVID-19 training data. On the other hand, fast training/testing and low computational cost are also necessary for quick deployment and development of COVID-19 screening systems, yet traditional methods are usually computationally intensive. To address these two problems, we propose MiniSeg, a lightweight model for efficient COVID-19 segmentation from CT images. Our efforts start with the design of an attentive hierarchical spatial pyramid (AHSP) module for lightweight, efficient, and effective multiscale learning, which is essential for image segmentation. Then, we build a two-path (TP) encoder for deep feature extraction, where one path uses AHSP modules to learn multiscale contextual features and the other is a shallow convolutional path that captures fine details. The two paths interact with each other to learn effective representations. Based on the extracted features, a simple decoder is added for COVID-19 segmentation. To compare MiniSeg with previous methods, we build a comprehensive COVID-19 segmentation benchmark. Extensive experiments demonstrate that the proposed MiniSeg achieves better accuracy because its small size (only 83K parameters) makes it less prone to overfitting. Its high efficiency also makes it easy to deploy and develop. The code has been released at https://github.com/yun-liu/MiniSeg.
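To make the AHSP idea concrete, the sketch below illustrates one plausible reading of the abstract: parallel dilated depthwise-separable convolution branches for multiscale context, merged hierarchically and reweighted by a channel attention gate. This is only a minimal PyTorch illustration under those assumptions; the class name `AHSPSketch`, the dilation rates, and the attention design are hypothetical and do not reproduce the released MiniSeg implementation, which is available at the GitHub link above.

```python
# Minimal, hypothetical sketch of an AHSP-style block (not the official MiniSeg code).
import torch
import torch.nn as nn


class AHSPSketch(nn.Module):
    def __init__(self, channels: int, dilations=(1, 2, 4, 8)):
        super().__init__()
        branch_ch = channels // len(dilations)
        self.split = branch_ch
        # One depthwise-separable branch per dilation rate (multiscale context).
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(branch_ch, branch_ch, 3, padding=d, dilation=d,
                          groups=branch_ch, bias=False),  # depthwise, dilated
                nn.Conv2d(branch_ch, branch_ch, 1, bias=False),  # pointwise
                nn.BatchNorm2d(branch_ch),
            )
            for d in dilations
        ])
        # Simple channel attention over the merged pyramid ("attentive" part).
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )
        self.act = nn.PReLU(channels)

    def forward(self, x):
        chunks = torch.split(x, self.split, dim=1)
        outs, prev = [], 0
        # Hierarchical merging: each branch also receives the previous branch's output.
        for chunk, branch in zip(chunks, self.branches):
            prev = branch(chunk + prev)
            outs.append(prev)
        merged = torch.cat(outs, dim=1)
        merged = merged * self.attention(merged)  # attentive channel reweighting
        return self.act(merged + x)               # residual connection


if __name__ == "__main__":
    block = AHSPSketch(64)
    y = block(torch.randn(1, 64, 128, 128))
    print(y.shape)  # torch.Size([1, 64, 128, 128])
```

Depthwise-separable, dilated branches are a common way to keep parameter counts in the tens of thousands while still covering multiple receptive-field sizes, which is consistent with the abstract's emphasis on a lightweight (83K-parameter) multiscale design.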