Abstract

Histopathological analysis is the gold standard for cancer diagnosis, but traditional manual identification is time-consuming, labor-intensive, and highly dependent on the experience of pathologists. In recent years, deep learning methods have enabled accurate, automated detection of lesion areas to assist pathologists. However, lesion detection remains difficult because lesions vary in size and shape and are embedded in complex surrounding tissue. Existing methods ignore the global interaction between lesion and normal areas and therefore extract important global features insufficiently. In addition, most methods fail to effectively fuse deep and shallow features, so the extracted features lack discriminative power; these deficiencies lead to incomplete segmentation and irregular lesion boundaries. To address these issues, an end-to-end histopathological image segmentation network based on a global context-aware module and deep feature aggregation (GCMDFA) is proposed for segmenting lesions in histopathological images (HIs). In GCMDFA, the global context-aware module explores the interaction between any two pixels to directly capture global contextual information related to lesion tissue in HIs. To better decode the encoder features, a deep feature aggregation structure is designed to effectively fuse deep and shallow features of HIs. Experimental results on three HI datasets (including public and external data) show that the proposed GCMDFA is competitive with state-of-the-art (SOTA) methods. This research can assist pathologists in analyzing HIs for clinical computer-aided diagnosis.
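The abstract describes two architectural ideas: a global context-aware module that models pairwise interactions between all pixels, and a structure that fuses deep and shallow encoder features. The sketch below is a minimal, assumption-based illustration of these two ideas in PyTorch, not the authors' implementation; the module names, channel sizes, non-local-style attention, and concatenation-based fusion are all illustrative choices.

```python
# Minimal sketch (assumptions, not the paper's code): a non-local-style global
# context module plus a simple deep/shallow feature fusion block.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalContextModule(nn.Module):
    """Every pixel attends to every other pixel (pairwise interactions)."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (B, HW, C/r)
        k = self.key(x).flatten(2)                      # (B, C/r, HW)
        v = self.value(x).flatten(2)                    # (B, C, HW)
        attn = torch.softmax(q @ k, dim=-1)             # (B, HW, HW) pairwise weights
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return x + self.gamma * out                     # residual connection


class DeepShallowFusion(nn.Module):
    """Upsample a deep (low-resolution) feature map and fuse it with a shallow one."""

    def __init__(self, deep_channels, shallow_channels, out_channels):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(deep_channels + shallow_channels, out_channels, 3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, deep, shallow):
        deep = F.interpolate(deep, size=shallow.shape[2:], mode="bilinear",
                             align_corners=False)
        return self.fuse(torch.cat([deep, shallow], dim=1))


if __name__ == "__main__":
    deep = torch.randn(1, 256, 16, 16)      # deep, low-resolution features
    shallow = torch.randn(1, 64, 64, 64)    # shallow, high-resolution features
    deep = GlobalContextModule(256)(deep)
    fused = DeepShallowFusion(256, 64, 128)(deep, shallow)
    print(fused.shape)                      # torch.Size([1, 128, 64, 64])
```

The attention map here is HW x HW, so memory grows quadratically with spatial size; applying such a module only at the deepest (smallest) encoder stage, as in the toy usage above, is one common way to keep this tractable.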
