In recent years, medical technology has seen rapid growth in multimodal image fusion. A single medical instrument can only acquire single-modality images, so the information it provides is limited, and physicians often need many multimodal images to obtain the complete information required for disease diagnosis. Using multimodal images directly increases the diagnostic burden significantly and makes errors and interference more likely. Fusion algorithms, which are widely used in the medical field, can effectively combine the large amount of information contained in multimodal images. However, existing methods struggle in the early stages of brain tumor prediction on white (binary) images and produce inaccurate image results. To overcome these problems, this work proposes a scheme combining an Adaptive Firefly Optimization based Convolutional Neural Network (AFFOCNN) with a Modified Fully Connected Layer (MFCL). The method consists of five main steps: noise removal, segmentation, feature extraction, image fusion, and image classification. First, noise removal improves image quality by suppressing noise. The MRI modality images are then segmented, subdividing each image into its constituent regions or objects and producing black-and-white (binary) images. Next, feature extraction is performed with the AFFOCNN algorithm, which extracts the most informative image features. Fusion of the multimodal images derives lower-level, middle-level, and higher-level image content, which can be viewed in multiple directions and fused across all of them. Finally, image classification is performed using the Modified Fully Connected Layer (MFCL), which improves training and testing on the extracted features efficiently. The results show that the proposed combination of AFFOCNN and MFCL outperforms existing algorithms, achieving an accuracy of 99.00%, precision of 98.00%, recall of 96.00%, a mean square error (MSE) of 12.00%, and an execution time of 2.40 seconds.
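The abstract describes a five-stage pipeline (noise removal, segmentation, AFFOCNN feature extraction, multimodal fusion, MFCL classification) without implementation details. The sketch below is only an illustration of how such stages could be chained; every function body, the averaging-based fusion, the median-filter denoising, and the single linear layer standing in for the MFCL classifier are assumptions for clarity, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import median_filter

# Illustrative pipeline sketch of the five stages named in the abstract.
# All stage implementations are simplified placeholders.

def remove_noise(image: np.ndarray, kernel: int = 3) -> np.ndarray:
    """Stage 1: noise removal, here a simple median filter (illustrative choice)."""
    return median_filter(image, size=kernel)

def segment(image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Stage 2: crude black-and-white segmentation by global thresholding."""
    norm = (image - image.min()) / (image.max() - image.min() + 1e-8)
    return (norm > threshold).astype(np.float32)

def extract_features(image: np.ndarray) -> np.ndarray:
    """Stage 3: stand-in for AFFOCNN feature extraction (here: flattened pixels)."""
    return image.reshape(-1)

def fuse(features_a: np.ndarray, features_b: np.ndarray) -> np.ndarray:
    """Stage 4: multimodal fusion, here simple averaging of feature vectors."""
    return 0.5 * (features_a + features_b)

def classify(fused: np.ndarray, weights: np.ndarray, bias: float = 0.0) -> int:
    """Stage 5: stand-in for the MFCL classifier (a single linear decision)."""
    return int(fused @ weights + bias > 0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    modality_a = rng.random((64, 64))   # placeholder for an MRI slice
    modality_b = rng.random((64, 64))   # placeholder for a second modality
    feats = [extract_features(segment(remove_noise(m)))
             for m in (modality_a, modality_b)]
    fused = fuse(*feats)
    label = classify(fused, rng.standard_normal(fused.size))
    print("predicted class:", label)
```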