Abstract
Background: Automatic pulmonary artery–vein separation is of considerable importance in the diagnosis and treatment of lung diseases. However, insufficient connectivity and spatial inconsistency remain persistent problems in artery–vein separation.

Methods: A novel automatic method for artery–vein separation in CT images is presented in this work. Specifically, a multi-scale information aggregated network (MSIA-Net), comprising multi-scale fusion blocks and deep supervision, is proposed to learn artery–vein features and aggregate additional semantic information, respectively. The proposed method integrates nine MSIA-Net models covering three tasks (artery–vein separation, vessel segmentation, and centerline separation) across axial, coronal, and sagittal slice views. First, preliminary artery–vein separation results are obtained by the proposed multi-view fusion strategy (MVFS). Then, a centerline correction algorithm (CCA) corrects the preliminary artery–vein separation results using the centerline separation results. Finally, the vessel segmentation results are used to reconstruct the artery–vein morphology. In addition, weighted cross-entropy and Dice losses are employed to address the class imbalance problem.

Results: We constructed 50 manually labeled contrast-enhanced CT scans for five-fold cross-validation, and experimental results demonstrate that our method achieves superior segmentation performance of 97.7%, 85.1%, and 84.9% on ACC, Pre, and DSC, respectively. Additionally, a series of ablation studies demonstrates the effectiveness of the proposed components.

Conclusion: The proposed method effectively addresses insufficient vascular connectivity and corrects the spatial inconsistency between arteries and veins.
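As a rough illustration of the class-imbalance handling mentioned above, the following is a minimal sketch of a combined weighted cross-entropy and Dice loss, assuming a PyTorch-style implementation with three classes (background, artery, vein). The class weights, the equal weighting of the two terms, and the class layout are illustrative assumptions, not the exact formulation used with MSIA-Net.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightedCEDiceLoss(nn.Module):
    """Sketch of weighted cross-entropy + Dice loss (assumed class weights)."""

    def __init__(self, class_weights=(0.1, 1.0, 1.0), smooth=1e-5):
        super().__init__()
        # Down-weight the dominant background class; weights are illustrative.
        self.register_buffer("class_weights", torch.tensor(class_weights))
        self.smooth = smooth

    def forward(self, logits, target):
        # logits: (N, C, D, H, W); target: (N, D, H, W) with integer labels.
        ce = F.cross_entropy(logits, target, weight=self.class_weights)

        # Soft Dice over all classes, averaged.
        probs = F.softmax(logits, dim=1)
        target_onehot = F.one_hot(target, num_classes=logits.shape[1])
        target_onehot = target_onehot.permute(0, 4, 1, 2, 3).float()

        dims = (0, 2, 3, 4)
        intersection = (probs * target_onehot).sum(dims)
        cardinality = probs.sum(dims) + target_onehot.sum(dims)
        dice = (2.0 * intersection + self.smooth) / (cardinality + self.smooth)
        dice_loss = 1.0 - dice.mean()

        # Equal weighting of the two terms is an assumption for this sketch.
        return ce + dice_loss
```

In practice, such a compound loss keeps the gradient signal from sparse vessel voxels from being swamped by background, which is the rationale the abstract gives for combining the two terms.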