Abstract

Objective. Accurate automatic detection of pulmonary nodules is critical for early lung cancer diagnosis, and promising progress has been made in developing effective deep models for nodule detection. However, most existing nodule detection methods merely integrate elaborately designed feature extraction modules into the backbone of the detection network to extract rich nodule features, while ignoring the disadvantages of the structure of the detection network itself. This study aims to address these disadvantages and develop a deep learning-based algorithm for pulmonary nodule detection that improves the accuracy of early lung cancer diagnosis.

Approach. An S-shaped network called S-Net is developed with a U-shaped network as its backbone. An information fusion branch propagates the lower-level details and positional information critical for nodule detection to higher-level feature maps; a head-shared scale-adaptive detection strategy captures information at different scales to better detect nodules of varying shapes and sizes; and a feature-decoupling detection head allows the classification and regression branches to focus on the information required for their respective tasks. A hybrid loss function is used to fully exploit the interplay between the classification and regression branches.

Main results. The proposed S-Net with ResSENet and three other U-shaped backbones, taken from the SANet, OSAF-YOLOv3 and MSANet (R+SC+ECA) models, achieves average CPM scores of 0.914, 0.915, 0.917 and 0.923 on the LUNA16 dataset, which are significantly higher than those achieved by other existing state-of-the-art models.

Significance. The experimental results demonstrate that the proposed method effectively improves nodule detection performance, implying potential applications in clinical practice.
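The abstract does not specify the exact form of the hybrid loss. A common way to couple the classification and regression branches of a detection head, which the description is consistent with, is a weighted sum of a focal classification loss and a smooth-L1 box regression loss. The sketch below is illustrative only, assuming binary nodule labels and this standard combination; the function names, `alpha`, `gamma` and `lam` values are assumptions, not the paper's implementation.

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    # Binary focal loss for the classification branch:
    # down-weights easy examples so hard nodule candidates dominate.
    p = np.clip(p, 1e-7, 1 - 1e-7)
    pt = np.where(y == 1, p, 1 - p)
    a = np.where(y == 1, alpha, 1 - alpha)
    return -a * (1 - pt) ** gamma * np.log(pt)

def smooth_l1(pred, target, beta=1.0):
    # Smooth L1 (Huber-style) loss for the box regression branch:
    # quadratic near zero, linear for large errors.
    d = np.abs(pred - target)
    return np.where(d < beta, 0.5 * d ** 2 / beta, d - 0.5 * beta)

def hybrid_loss(cls_prob, cls_label, box_pred, box_target, lam=1.0):
    # Weighted sum couples the two branches; lam trades off
    # classification confidence against localization accuracy.
    l_cls = focal_loss(cls_prob, cls_label).mean()
    l_reg = smooth_l1(box_pred, box_target).mean()
    return l_cls + lam * l_reg
```

For a perfectly classified and localized candidate the combined loss is close to zero, while either a misclassification or a localization error alone raises it, which is what lets gradient descent balance the two tasks.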
