Abstract

Objective. Nuclei segmentation is crucial for pathologists to accurately classify and grade cancer. However, the task faces significant challenges, such as complex background structures in pathological images, the high-density distribution of nuclei, and cell adhesion. Approach. In this paper, we present an interactive nuclei segmentation framework that increases the precision of nuclei segmentation. Our framework incorporates expert monitoring to gather as much prior information as possible and accurately segments complex nucleus images through limited pathologist interaction, in which only a small fraction of the nucleus locations in each image are labeled. The initial contour is determined by the Voronoi diagram generated from the labeled points, which is then fed into an optimized weighted convex difference model to regularize partition boundaries in the image. We also provide a theoretical analysis of the mathematical model, proving that the objective function decreases monotonically. Furthermore, we explore a histogram-based postprocessing stage that is simple to apply and avoids arbitrary and subjective individual choices. Main results. To evaluate our approach, we conduct experiments on both a cervical cancer dataset and a nasopharyngeal cancer dataset. The experimental results demonstrate that our approach achieves competitive performance compared with other methods. Significance. The Voronoi diagram serves as prior information for the active contour by providing the position of each individual cell. Moreover, the active contour model achieves precise segmentation results while offering mathematical interpretability.
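
To illustrate the first step of the pipeline, the sketch below shows one way a Voronoi partition of the image grid could be derived from a few expert-labeled nucleus centers; this is a minimal assumption-based example (using SciPy's nearest-neighbor query, not the authors' implementation), and the function name `voronoi_label_map` and the example seed coordinates are hypothetical.

```python
# Minimal sketch (not the paper's code): assign each pixel to its nearest
# labeled nucleus center, which yields the Voronoi partition of the image grid.
# The boundaries of these cells can then serve as the initial contour.
import numpy as np
from scipy.spatial import cKDTree

def voronoi_label_map(points, shape):
    """points: (N, 2) array of (row, col) nucleus centers; shape: (H, W)."""
    rows, cols = np.mgrid[0:shape[0], 0:shape[1]]
    pixels = np.stack([rows.ravel(), cols.ravel()], axis=1)
    _, nearest = cKDTree(points).query(pixels)   # index of the closest labeled point
    return nearest.reshape(shape)                # one Voronoi cell per labeled nucleus

# Example: three labeled nuclei in a 128x128 image (coordinates are illustrative)
seeds = np.array([[30, 40], [64, 90], [100, 20]])
labels = voronoi_label_map(seeds, (128, 128))    # initial partition for the contour model
```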
