Abstract

Because of their ability to learn features automatically and to produce more precise results in less time when detecting abnormal tissue in large volumes of brain MR images, automated segmentation approaches based on deep learning architectures such as CNNs, AlexNet, ResNet, and Inception have proven effective in supporting accurate diagnosis, risk assessment, and treatment planning by medical practitioners. This work reviews several such approaches and their diagnostic efficiency. It also proposes a new approach based on recent deep learning research, applies it to the segmentation problem, and compares its results with those of other methods so that more reliable diagnostic decisions can be made. T1-weighted, T2-weighted, and FLAIR MR images collected from different imaging centers are used for the experimental study, since they provide richer structural information about brain anatomy than other imaging modalities such as PET and CT. The classification results are compared with manual segmentations produced by radiologists and are found to be accurate in detecting and segmenting abnormal tissue. This supports improved treatment planning, monitoring, and clinical acceptance of the proposed methods, depending on their computational simplicity and the degree of user involvement in the overall process. In addition, a hybrid semantic segmentation approach, a Recurrent Residual Deep Neural Network combined with U-Net (RRDNN with U-Net), is introduced. The residual units allow the model to be trained to greater depth, while the recurrent residual layers combined with U-Net yield effective segmentation representations. Using this approach, a classification accuracy of up to 94% is attained for abnormal tissue identification from brain MR images, exceeding existing approaches evaluated on the same platform.
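
To make the described architecture concrete, the following is a minimal sketch of a recurrent residual convolutional block and a small U-Net-style encoder/decoder built from it, assuming PyTorch. The layer widths, the recurrence depth t, and all class and function names (RecurrentConv, RRBlock, TinyRRUNet) are illustrative assumptions, not the authors' exact RRDNN with U-Net configuration.

```python
# Sketch only: a recurrent residual block plus a two-level U-Net, assuming PyTorch.
import torch
import torch.nn as nn


class RecurrentConv(nn.Module):
    """Apply the same 3x3 convolution t times, feeding the input back in each pass."""
    def __init__(self, channels, t=2):
        super().__init__()
        self.t = t
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        out = self.conv(x)
        for _ in range(self.t - 1):
            out = self.conv(x + out)   # recurrent refinement of the feature map
        return out


class RRBlock(nn.Module):
    """Recurrent residual block: two recurrent conv units plus an identity skip."""
    def __init__(self, in_ch, out_ch, t=2):
        super().__init__()
        self.project = nn.Conv2d(in_ch, out_ch, kernel_size=1)  # match channel count
        self.body = nn.Sequential(RecurrentConv(out_ch, t), RecurrentConv(out_ch, t))

    def forward(self, x):
        x = self.project(x)
        return x + self.body(x)        # residual connection eases deep training


class TinyRRUNet(nn.Module):
    """Two-level U-Net with recurrent residual blocks and a skip concatenation."""
    def __init__(self, in_ch=1, num_classes=2, base=16):
        super().__init__()
        self.enc1 = RRBlock(in_ch, base)
        self.down = nn.MaxPool2d(2)
        self.enc2 = RRBlock(base, base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = RRBlock(base * 2, base)          # takes concatenated skip features
        self.head = nn.Conv2d(base, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.down(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d1)           # per-pixel class logits


if __name__ == "__main__":
    # One synthetic single-channel "MR slice"; output is a per-pixel logit map.
    model = TinyRRUNet(in_ch=1, num_classes=2)
    logits = model(torch.randn(1, 1, 128, 128))
    print(logits.shape)                # torch.Size([1, 2, 128, 128])
```

In this sketch the residual skip in RRBlock is what allows deeper training, while the repeated application of the shared convolution in RecurrentConv plays the role of the recurrent refinement described above; a full model would stack more encoder/decoder levels and use a segmentation loss such as cross-entropy or Dice over the abnormal-tissue classes.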
