Abstract

In this paper, we present a U-Net-based deep learning method for segmenting polyps and instruments in the image datasets provided by the MedAI Challenge 2021. For the polyp segmentation task, we developed a U-Net-based algorithm for segmenting polyps in endoscopy images; the main focus of this task is to achieve high segmentation metrics on the supplied test dataset. Analogously, for the instrument segmentation task, we developed a U-Net-based algorithm for segmenting the instruments present in colonoscopy videos.

Highlights

  • The 2021 MedAI Challenge [3] comprises three tasks, defined with input from experts in the field, that address unique gastrointestinal image segmentation problems

  • Medical image segmentation is the task of classifying each pixel of an object of interest in medical images

  • Data pre-processing: as the images in the KVASIR Polyp Dataset [1] and the KVASIR Instruments Dataset [2] come in various shapes, we scaled them all to 256 × 256 pixels in order to train the models
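The resizing step described in the last highlight can be sketched as follows. This is a minimal, dependency-free illustration using nearest-neighbour sampling on a list-of-lists image; the paper does not state which interpolation method or library was actually used, so the function name and approach here are assumptions for illustration only.

```python
def resize_nearest(img, out_h=256, out_w=256):
    """Nearest-neighbour resize of a 2-D list-of-lists image to out_h x out_w."""
    in_h, in_w = len(img), len(img[0])
    return [
        [img[(y * in_h) // out_h][(x * in_w) // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Example: a 300 x 400 dummy binary mask squeezed to 256 x 256.
mask = [[(r + c) % 2 for c in range(400)] for r in range(300)]
resized = resize_nearest(mask)
print(len(resized), len(resized[0]))  # 256 256
```

In practice a library such as Pillow or OpenCV would be used, and nearest-neighbour resampling is the usual choice for masks because it keeps label values discrete.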


Summary

Introduction

A Fully Convolutional Network (FCN) is an early deep learning (DL) architecture trained end-to-end for pixel-wise prediction in semantic segmentation tasks. Another prominent image segmentation architecture for pixel-wise prediction is U-Net, which is likewise trained end-to-end. For the Instrument Segmentation Development Task, we were given 590 images and their corresponding masks in JPEG and PNG format. For the Polyp Segmentation Development Task, we were given 1000 images and their corresponding masks in JPEG format. These datasets were provided for training purposes.
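The encoder-decoder idea behind U-Net can be illustrated by tracing the spatial resolution of the feature maps. The sketch below assumes the common layout of four 2× max-pooling stages mirrored by four 2× upsampling stages and 'same'-padded convolutions, applied to the 256 × 256 inputs mentioned in the highlights; the depth and padding scheme are assumptions, since the paper's exact configuration is not given here.

```python
def unet_spatial_trace(size=256, depth=4):
    """Trace feature-map resolution through a U-Net style encoder-decoder.

    With 'same'-padded convolutions, only the 2x pooling / upsampling
    steps change the resolution, so each stage halves (encoder) or
    doubles (decoder) the spatial size.
    """
    encoder = [size // (2 ** d) for d in range(depth + 1)]  # contracting path
    decoder = encoder[-2::-1]                               # expanding path mirrors it
    return encoder, decoder

enc, dec = unet_spatial_trace()
print(enc)  # [256, 128, 64, 32, 16]
print(dec)  # [32, 64, 128, 256]
```

The mirrored resolutions are what make U-Net's skip connections possible: each decoder stage concatenates the encoder feature map of the same spatial size before convolving, which is how pixel-wise detail is recovered in the final mask.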

Materials and methods
Model Parameters
Discussion

