Abstract

The development of computer-aided diagnosis (CAD) systems for automatic lung nodule detection in thoracic computed tomography (CT) scans has been an active area of research in recent years. The Lung Nodule Analysis 2016 (LUNA16) challenge encourages researchers to propose a variety of nodule detection algorithms built on two key stages: (1) candidate detection and (2) false-positive reduction. In this paper, a new convolutional neural network (CNN) architecture is proposed to efficiently solve the second stage of LUNA16. Specifically, we find that typical CNN models pay little attention to the characteristics of the input data. To address this limitation, we apply an attention mechanism: we propose a technique that attaches a Squeeze-and-Excitation block (SE-Block) after each convolution layer of the CNN to emphasize the feature maps most relevant to the characteristics of the input image, forming an Attention sub-Convnet. The new CNN architecture is built by connecting these Attention sub-Convnets. In addition, we analyze the choice between triplet loss and softmax loss functions to boost the performance of the proposed CNN. From this study, we conclude that softmax loss should be used during the CNN training phase and triplet loss during the testing phase. Our proposed CNN is used to minimize the number of redundant candidates and thereby improve the efficiency of false-positive reduction on the LUNA database. The results obtained in comparison with previous models indicate the feasibility of the proposed model.
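To make the split between training and testing losses concrete, below is a minimal PyTorch sketch (our own illustration, not the paper's released code): a classifier head is optimized with the softmax (cross-entropy) loss during training, while a triplet-margin criterion is applied to the learned embeddings at test time. The network shape, embedding size, and margin are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Toy feature extractor and softmax classifier head; sizes are illustrative only.
embedder = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 128), nn.ReLU())
classifier = nn.Linear(128, 2)                      # nodule vs. non-nodule
softmax_loss = nn.CrossEntropyLoss()                # softmax loss for the training phase

# Training phase: optimize with the softmax (cross-entropy) criterion.
patches = torch.randn(8, 1, 32, 32)                 # dummy candidate patches
labels = torch.randint(0, 2, (8,))
loss = softmax_loss(classifier(embedder(patches)), labels)
loss.backward()

# Testing phase: score the learned embeddings with a triplet-margin criterion.
triplet_loss = nn.TripletMarginLoss(margin=1.0)
with torch.no_grad():
    anchor = embedder(torch.randn(4, 1, 32, 32))
    positive = embedder(torch.randn(4, 1, 32, 32))
    negative = embedder(torch.randn(4, 1, 32, 32))
    score = triplet_loss(anchor, positive, negative)
```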

Highlights

  • Lung cancer is the leading cause of death from cancer for both men and women

  • We use the reformatted version, LUNA16. This dataset includes 888 computed tomography (CT) scans with annotations describing the coordinates of each nodule region and whether or not the candidate is labeled as a nodule

  • Experiments showed that AST, the proposed network trained with softmax loss on a combination of multiple Attention sub-Convnets with double fully connected layers and validated with triplet loss, gave the best result


Summary

Introduction

Lung cancer is the leading cause of death from cancer for both men and women. According to the annual statistical report of the American Institute for Cancer Research on lung cancer patients, there were 2 million new cases in 2018 [1]. The motivation is that, inside a feature map with C channels, not all C channels are crucial to the final decision of a deep network. Some channels matter more than others and should be focused on, which calls for an attention (self-attention) mechanism to emphasize those essential ones. The authors proposed a block (similar to the ResNet block [10]), called the Squeeze-and-Excitation block. This block is built upon a transformation Ftr that maps the input feature X ∈ R^(H′×W′×C′) to the output feature U ∈ R^(H×W×C). The final sigmoid function constrains the values of the vector s to [0,1]. After obtaining the attention gate s, the attended output feature is computed as X̃ = s × U. By multiplying the output feature U with the vector s channel-wise, less crucial channels are suppressed while the important ones are retained
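As an illustration of the gating described above, here is a minimal PyTorch sketch of a Squeeze-and-Excitation block (an assumed re-implementation of the idea, with an illustrative reduction ratio of 16): the squeeze step pools U over its spatial dimensions, the excitation step produces the sigmoid gate s in [0,1], and the block returns the channel-wise product s × U.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation gate over the channels of a feature map U."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)            # global average pool over H x W
        self.excitation = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                                  # gate s constrained to [0, 1]
        )

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = u.shape
        s = self.squeeze(u).view(b, c)                     # squeeze: (B, C, H, W) -> (B, C)
        s = self.excitation(s).view(b, c, 1, 1)            # excitation: per-channel gate s
        return u * s                                       # X_tilde = s x U (channel-wise)

# Attaching the gate after a convolution layer forms one "Attention sub-Convnet".
u = torch.randn(2, 64, 16, 16)                             # dummy feature map U
gated = SEBlock(64)(u)                                     # less useful channels suppressed
```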

