Abstract

Cell segmentation is one of the most significant tasks in microscopic image analysis, as it facilitates differential cell counting and the analysis of sub-cellular structures for diagnosing cytopathological diseases. Bright-field microscopy is considered the gold standard among the optical microscopes used for cell analysis owing to its simplicity and cost-effectiveness. However, automatic cell segmentation in bright-field microscopy is challenging due to imaging artifacts, poor contrast, overlapping cells, and wide variability in cell appearance. Moreover, the availability of labeled bright-field images is limited, further constraining research on supervised models for automated cell segmentation. In this work, we propose a novel cell segmentation framework termed Saliency and Ballness driven U-shaped Network (SBU-net) to overcome these challenges. The proposed architecture comprises a novel data-driven feature fusion module that enhances the perceivable structure of cells using their saliency and ballness features. This module, together with an encoder–decoder model with dilated convolutions and a novel combination loss function, captures the global context of cell structures and produces accurate segmentation results. SBU-net is evaluated on two publicly available bright-field datasets of T cells and pancreatic cancer cells. Under 5-fold cross-validation, it outperforms state-of-the-art models, producing mean Intersection over Union (IoU) scores of 0.804 and 0.829 and mean Dice scores of 0.891 and 0.906, respectively. To assess its generalizability, the architecture was also evaluated on a fluorescent dataset, where it achieved a mean IoU of 0.892 and a mean Dice of 0.948, outperforming other models reported in the literature.
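The abstract describes a feature-fusion front end feeding a dilated-convolution encoder–decoder. The sketch below is a minimal, illustrative PyTorch example of one plausible reading of that pipeline: the saliency and ballness maps are concatenated with the raw bright-field image along the channel axis before entering a dilated encoder block. The channel-concatenation fusion, the block layout, and all names here are assumptions made for illustration, not the authors' published implementation, which is only summarized above.

    import torch
    import torch.nn as nn

    class DilatedConvBlock(nn.Module):
        # Two 3x3 convolutions with dilation to enlarge the receptive field
        # and capture more global context, as the abstract motivates.
        def __init__(self, in_ch, out_ch, dilation=2):
            super().__init__()
            self.block = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=dilation, dilation=dilation),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
                nn.Conv2d(out_ch, out_ch, 3, padding=dilation, dilation=dilation),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )

        def forward(self, x):
            return self.block(x)

    def fuse_inputs(image, saliency, ballness):
        # Assumed fusion strategy: stack the raw image with its saliency and
        # ballness maps as extra channels so the encoder sees the enhanced
        # cell structure alongside the original intensities.
        return torch.cat([image, saliency, ballness], dim=1)

    # Toy batch: 1-channel bright-field images plus the two feature maps.
    image = torch.rand(4, 1, 256, 256)
    saliency = torch.rand(4, 1, 256, 256)
    ballness = torch.rand(4, 1, 256, 256)

    fused = fuse_inputs(image, saliency, ballness)   # shape (4, 3, 256, 256)
    encoder_stage = DilatedConvBlock(in_ch=3, out_ch=64)
    features = encoder_stage(fused)                  # shape (4, 64, 256, 256)

In a full U-shaped network, several such blocks would be stacked with downsampling on the encoder side and mirrored upsampling with skip connections on the decoder side; the combination loss mentioned in the abstract is not specified here, so it is omitted from the sketch.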
