Abstract

The spatial correlation among different tissue components is an essential characteristic for the diagnosis of breast cancer based on histopathological images. Graph convolutional networks (GCNs) can effectively capture this spatial feature representation and have been successfully applied to histopathological-image-based computer-aided diagnosis (CAD). However, current GCN-based approaches require complicated image preprocessing for graph construction. In this work, we propose a novel CAD framework for the classification of breast histopathological images that integrates a convolutional neural network (CNN) and a GCN (named CNN-GCN) into a unified framework: the CNN learns high-level features from histopathological images for adaptive graph construction, and the generated graph is then fed to the GCN to learn the spatial features of histopathological images for the classification task. In particular, a novel clique GCN (cGCN) is proposed to learn a more effective graph representation, arranging both forward and backward connections between any two graph convolution layers. Moreover, a new group graph convolution is developed to replace the classical graph convolution in each layer of the cGCN, so as to reduce redundant information and implicitly select a superior fused feature representation. The proposed clique group GCN (cgGCN) is then embedded in the CNN-GCN framework (named CNN-cgGCN) to promote the learned spatial representation for the diagnosis of breast cancer. Experimental results on two public breast histopathological image datasets demonstrate the effectiveness of the proposed CNN-cgGCN, which outperforms all compared algorithms.
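The core operations described above can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the paper's implementation: the adaptive graph is approximated by thresholded cosine similarity over node (patch) features, a single graph convolution uses the standard symmetrically normalized propagation rule, and the group graph convolution splits the feature channels into groups that are convolved independently and concatenated.

```python
import numpy as np

def adaptive_adjacency(features, threshold=0.5):
    # Hypothetical stand-in for the paper's adaptive graph construction:
    # connect nodes whose CNN feature vectors have high cosine similarity.
    norm = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-8)
    sim = norm @ norm.T
    adj = (sim > threshold).astype(float)
    np.fill_diagonal(adj, 1.0)  # add self-loops
    return adj

def graph_conv(adj, feats, weight):
    # Classical graph convolution: ReLU(D^{-1/2} A D^{-1/2} H W).
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    a_norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ feats @ weight, 0.0)

def group_graph_conv(adj, feats, weights):
    # Group graph convolution: split channels into len(weights) groups,
    # convolve each group with its own weight matrix, then concatenate.
    groups = np.split(feats, len(weights), axis=1)
    return np.concatenate(
        [graph_conv(adj, g, w) for g, w in zip(groups, weights)], axis=1)

# Example: 6 nodes with 8-dim CNN features, 2 groups of 4 channels each.
rng = np.random.default_rng(0)
feats = rng.normal(size=(6, 8))
adj = adaptive_adjacency(feats, threshold=0.0)
weights = [rng.normal(size=(4, 2)) for _ in range(2)]
out = group_graph_conv(adj, feats, weights)  # shape (6, 4)
```

In the full cgGCN, the clique wiring would additionally feed each layer's output both forward and backward to the other layers; the sketch above covers only a single layer's group convolution.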
