Abstract

Segmenting brain tumours in medical images is a crucial task. Improving treatment options and patient survival rates depends on the early diagnosis of brain tumours. Segmenting tumours for cancer diagnosis from the large volumes of MRI (Magnetic Resonance Imaging) images acquired in clinical practice is difficult and tedious, so automatic brain tumour segmentation techniques are needed. Deep learning algorithms for automatic tumour segmentation have recently grown in popularity because they produce state-of-the-art results and are more effective at this problem than alternative techniques. Most recent studies use four MRI modalities (T1, T1c, T2, and FLAIR), because each delivers distinct and crucial characteristics of the different tumour regions. Although several of these studies achieved good segmentation on the datasets they used, their network structures are highly complex and require long training and testing times. Therefore, the proposed work constructs a simple and novel JGate-AttResUNet network design to produce a robust and reliable brain tumour segmentation system. Compared with other models, this method localizes tumours more effectively and precisely; to this end, the J-Gate attention mechanism is used to enhance tumour localization. Experiments show that the proposed model generates competitive results on the BRATS 2015 and BRATS 2019 datasets, producing mean Dice values of 0.896 and 0.913, respectively. Additional quantitative and qualitative assessments are also discussed.
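The mean Dice values quoted above use the standard Dice similarity coefficient for comparing a predicted segmentation mask against the ground truth. A minimal sketch of that metric (an illustrative implementation, not the authors' code; the `eps` smoothing term is an assumption to avoid division by zero) could look like:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice similarity coefficient between two binary segmentation masks.

    Dice = 2 * |pred ∩ target| / (|pred| + |target|), ranging from 0 (no
    overlap) to 1 (perfect overlap). `eps` guards against empty masks.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection) / (pred.sum() + target.sum() + eps)
```

For example, a prediction that covers one of a target's two foreground voxels, with one false positive, scores 2·1 / (2 + 2) = 0.5.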
