Abstract

Over the past few decades, artificial intelligence and machine learning have seen active research in areas such as computer vision, natural language processing, and speech processing. As a result, deep learning models have become state-of-the-art for computer vision tasks such as object detection, classification, and segmentation. The fruits of this research extend to the design of robust and reliable digital health systems, as well as other applications in the healthcare sector. Many clinical applications require the automatic segmentation of medical images, and recent deep learning-based approaches have demonstrated state-of-the-art performance on medical image segmentation tasks. In addition to their ability to automatically extract features and generalize over large amounts of data, transfer learning-based deep learning models have proven especially useful in data-scarce areas such as the medical domain. In this research, we investigate and demonstrate the efficacy of a DCNN-based transfer learning model, Res101_Unet, trained and/or fine-tuned to perform tumor tissue segmentation in MRI, CT, PET, and X-ray images of organ scans with little data. For our experimental study, we employed two image datasets, 'Liver Tumor' and 'Gland Colon Cancer', both obtained from the Kaggle portal, and an open-source segmentation model API. Our findings indicate that domain similarity-based transfer learning can be applied to data-scarce sectors. We achieved 98.47% accuracy and an IoU score of 0.9891 on the Liver Tumor data, and 69.56% accuracy and an IoU score of 0.7043 on the Gland Colon Cancer dataset.
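For readers unfamiliar with the IoU (Intersection over Union) metric reported above, the sketch below shows how it is typically computed for binary segmentation masks. This is a minimal illustration with NumPy, not the evaluation code used in the study; the function name and toy masks are our own.

```python
import numpy as np

def iou_score(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """IoU between two binary segmentation masks (illustrative helper)."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    union = np.logical_or(pred, true).sum()
    # Convention: two empty masks are a perfect match.
    return float(intersection) / float(union) if union > 0 else 1.0

# Toy example: predicted mask overlaps the ground truth in 2 of 3 marked pixels.
pred = np.array([[1, 1],
                 [0, 1]])
true = np.array([[1, 0],
                 [0, 1]])
print(iou_score(pred, true))  # intersection = 2, union = 3 -> 0.666...
```

An IoU of 1.0 means the predicted and ground-truth tumor regions coincide exactly, so the 0.9891 reported for the Liver Tumor data indicates near-perfect spatial overlap.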

