Abstract

Automated nucleus segmentation is considered the gold standard for diagnosing several severe diseases. Accurate instance segmentation of nuclei remains very challenging because of the large number of clustered nuclei and the varying appearance of nuclei across tissue types. In this paper, a neural network is proposed for fast and accurate instance segmentation of nuclei in histopathology images. The network is inspired by U-Net and residual networks. The main contribution of the proposed model is enhancing the classification accuracy of nuclear boundaries by moderately preserving the spatial features, achieved by reducing the size of the feature maps only gradually. In addition, a proposed 2D convolution layer is used instead of the conventional 3D convolution layer at the core of CNN-based architectures: the feature maps are first compacted and then convolved with 2D kernel filters. This significantly reduces the processing time and avoids GPU out-of-memory problems. More features are also extracted in the deeper layers of the network without dramatically degrading the spatial features. Hence, fewer layers are required to compensate for the loss of spatial features, which further reduces the processing time. The proposed approach is applied to two multi-organ datasets and evaluated using the Aggregated Jaccard Index (AJI), the F1-score, and the number of frames per second. In addition, the formula of the AJI is modified to reflect object- and pixel-level errors more accurately. The proposed model is compared with several state-of-the-art architectures and shows better performance in terms of segmentation speed and accuracy.
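
The abstract describes the proposed layer only at a high level. The snippet below is a minimal, hypothetical PyTorch sketch of one way to read "compacting the feature maps before convolving them with 2D kernel filters": a 1x1 channel-compaction convolution followed by a standard 2D spatial convolution. The class name CompactConv2d, the reduction ratio, and the activation choice are illustrative assumptions and not the authors' implementation.

import torch
import torch.nn as nn

class CompactConv2d(nn.Module):
    """Hypothetical 'compact then convolve' block (assumed interpretation)."""

    def __init__(self, in_channels, out_channels, kernel_size=3, reduction=4):
        super().__init__()
        compact_channels = max(in_channels // reduction, 1)
        # Compact the stacked feature maps along the channel axis first ...
        self.compact = nn.Conv2d(in_channels, compact_channels, kernel_size=1)
        # ... then apply the 2D kernel filters to the reduced volume.
        self.spatial = nn.Conv2d(compact_channels, out_channels,
                                 kernel_size=kernel_size,
                                 padding=kernel_size // 2)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.spatial(self.act(self.compact(x))))

# Usage example on a dummy batch of feature maps from a histopathology tile.
x = torch.randn(1, 64, 256, 256)   # (batch, channels, height, width)
block = CompactConv2d(64, 128)
y = block(x)                       # -> torch.Size([1, 128, 256, 256])

Because the 1x1 compaction shrinks the channel dimension before the spatial convolution, the expensive k×k filtering runs on far fewer channels, which is consistent with the reported reduction in processing time and GPU memory use.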
