Abstract

Lung cancer is one of the most prevalent diseases worldwide and the most common type of cancer. Non-small cell lung cancer (NSCLC), which accounts for more than 85% of cases, has a five-year survival rate of less than 20% and widely varying treatment modalities. In this study, a multi-feature multi-attention network (MFMANet) is proposed for NSCLC subtype classification to address the low classification accuracy between subtypes caused by small lesion size and similar background. MFMANet introduces two modules: a multi-scale spatial channel attention module (MSAM) and a multi-feature fusion global local attention module (MFGLA). MSAM preserves spatial features during channel feature fusion to capture high-order statistical information. MFGLA fuses multiple features effectively to avoid the interference caused by scale differences, and its global and local attention branches extract global and local information to enhance the perception of small lesion regions. The performance of the proposed MFMANet was validated on two public datasets and compared with other classification networks, including ResNet18, ShuffleNetv2, MobileNetv3, and MnasNet. MFMANet achieved 99.06% and 91.67% accuracy in the binary classification of CT images of lung adenocarcinoma (ADC) and lung squamous cell carcinoma (SCC), outperforming the other methods. This study confirms that the proposed MFMANet provides an effective solution for the subtype classification of NSCLC.
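
To make the dual-branch global/local attention idea concrete, the following is a minimal PyTorch-style sketch of a block that combines a channel-wise global branch with a spatial local branch. The class name GlobalLocalAttention, the reduction ratio, the 7x7 local convolution, and the residual fusion are illustrative assumptions for exposition only, not the authors' published MFGLA implementation.

```python
# Hypothetical sketch of a dual-branch global/local attention block,
# loosely in the spirit of the MFGLA module described in the abstract.
# All layer sizes and fusion choices are assumptions, not the paper's code.
import torch
import torch.nn as nn


class GlobalLocalAttention(nn.Module):
    """Fuses a global (channel-wise) and a local (spatial) attention branch."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Global branch: squeeze spatial dimensions, re-weight channels.
        self.global_branch = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Local branch: a per-pixel spatial attention map.
        self.local_branch = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = self.global_branch(x)   # (B, C, 1, 1) channel weights
        l = self.local_branch(x)    # (B, 1, H, W) spatial weights
        # Apply both attentions with a residual connection so small
        # lesion regions are emphasised without losing global context.
        return x + x * g * l


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)   # dummy CT feature maps
    out = GlobalLocalAttention(64)(feats)
    print(out.shape)                     # torch.Size([2, 64, 32, 32])
```

In this kind of design, the global branch captures which feature channels matter overall, while the local branch highlights where in the image to look, which is why such dual-branch attention is often used to improve sensitivity to small regions such as lesions.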
