Abstract
Most primary brain tumors are malignant, consisting of masses of abnormal tissue that grow uncontrollably. Recently, deep transfer learning (DTL) has been applied to the automated clinical analysis of magnetic resonance imaging (MRI) for characterizing brain tumors. Previous studies have shown that tumors in MR images typically exhibit local features, vary widely in extent, and carry a high level of uncertainty. Accordingly, we introduce a two-branch parallel model that integrates a Transformer Module (TM) with a Self-Attention Unit (SAU) and a convolutional neural network (CNN) to classify brain tumors in MR images. We also propose a novel cross-fusion strategy that combines the local features extracted by the CNN with the global features extracted by the TM to improve classification accuracy. This hybrid architecture allows the parallel branches to exchange information, yielding representations that discriminate among different tumor types. Additionally, we developed a lightweight, improved CNN architecture (iResNet) that distinguishes tumor features in MR images. Of the 3064 slices in the four-class MR image set from the Figshare dataset, 20% were held out as unseen images; the remainder was divided into 60%, 20%, and 20% for training, validation, and testing. Combining the overall structure with similar CNN backbones (i.e., VGG, DenseNet, and ResNet), we found that iVGG, iDenseNet, and iResNet achieve accuracies of 98.59%, 98.94%, and 99.30%, respectively. By exploiting both local and global features, the resulting model detects brain tumors in MRI accurately and generalizably, enabling rapid and accurate diagnosis.
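The abstract does not give implementation details of the cross-fusion between the CNN and Transformer branches. The sketch below is only an illustration of the general idea, assuming a simple concatenation-style fusion of a local (CNN-like) feature stream and a global (token-pooling, Transformer-like) feature stream; the functions `cnn_branch`, `transformer_branch`, and `cross_fusion` are hypothetical stand-ins, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def cnn_branch(x):
    """Hypothetical local-feature extractor (stand-in for the CNN/iResNet branch)."""
    # Global average pooling over the spatial dimensions -> one value per channel.
    return x.mean(axis=(1, 2))

def transformer_branch(x):
    """Hypothetical global-feature extractor (stand-in for the TM/SAU branch)."""
    # Flatten spatial positions into tokens, then pool across tokens.
    tokens = x.reshape(x.shape[0], -1, x.shape[-1])
    return tokens.max(axis=1)

def cross_fusion(local_feat, global_feat):
    # Assumed fusion: merge the two parallel streams by concatenation;
    # the paper's actual cross-fusion strategy may differ.
    return np.concatenate([local_feat, global_feat], axis=1)

x = rng.standard_normal((4, 224, 224, 3))  # toy batch of 4 "MR slices"
fused = cross_fusion(cnn_branch(x), transformer_branch(x))
print(fused.shape)  # (4, 6): 3 local + 3 global channels per image
```

In a real model, the fused representation would feed a classification head over the tumor classes; concatenation is only one of several plausible fusion operators (addition and attention-weighted mixing are common alternatives).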