Abstract
The rapid and precise segmentation of cell nuclei from hematoxylin and eosin-stained tissue images is an essential task with significant implications for a range of clinical applications. Nucleus segmentation poses specific challenges due to the high variability of nuclear morphology and the complexity of the surrounding tissue. Furthermore, previous studies have relied primarily on small-scale datasets of limited diversity, potentially hindering their applicability to clinical tasks. This study introduces a novel approach, the Double-stage Codec Attention Network (DSCA-Net), designed to segment nuclei automatically and accurately. Specifically, we present a hierarchical feature extraction module that maximizes the utilization of the morphological characteristics of cell nuclei in the tissue, thereby providing critical semantic information for nucleus segmentation. Feature selection units are then employed to enhance relevant features and suppress interfering ones, improving the overall expressive capacity of the representation. The multi-scale deep feature fusion module uses interrelated encoder–decoder connections to jointly optimize and integrate this information, generating a robust hierarchical feature pyramid. Finally, the feature attention fusion mechanism captures spatial and directional information, aiding the model in accurately localizing and recognizing cell nuclei. We rigorously evaluated our proposed method on the PanNuke dataset, the largest comprehensive histology dataset of cancer tissues. In terms of the average F1-score across all segmentation classes in the PanNuke dataset, DSCA-Net outperforms other state-of-the-art models such as DeepLabV3+, TransUNet, Triple U-net, and TransNuSeg by 1.38, 1.44, 2.64, and 1.02, respectively. Additionally, DSCA-Net shows excellent efficiency in generating predicted segmentation maps, outperforming all compared models.
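The abstract does not detail how the feature selection units "enhance relevant features and suppress interfering ones." A minimal sketch, assuming a squeeze-and-excitation-style channel attention gate (a common choice for this kind of feature reweighting, not necessarily the paper's exact design), illustrates the idea; all function and weight names below are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def feature_selection_gate(feat, w1, w2):
    """Hypothetical channel-attention gate: global-average-pool each
    channel, pass the result through a small bottleneck MLP, then
    rescale channels so informative features are amplified and
    interfering ones are attenuated."""
    # feat: (C, H, W) feature map from an encoder stage
    pooled = feat.mean(axis=(1, 2))          # squeeze: one value per channel, shape (C,)
    hidden = np.maximum(0.0, w1 @ pooled)    # excitation MLP with ReLU, shape (C//2,)
    weights = sigmoid(w2 @ hidden)           # per-channel gate in (0, 1), shape (C,)
    return feat * weights[:, None, None]     # reweight each channel of the map

# Toy example with random features and weights
rng = np.random.default_rng(0)
C, H, W = 8, 4, 4
feat = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // 2, C))
w2 = rng.standard_normal((C, C // 2))
out = feature_selection_gate(feat, w1, w2)
print(out.shape)  # same shape as the input feature map
```

Because each gate value lies in (0, 1), the unit can only scale channels down relative to the input, which is what makes it a selection (suppression) mechanism rather than a general transformation.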