Abstract

Background: The identification of human epithelial type-2 (HEp-2) mitotic cell patterns in indirect immunofluorescence (IIF) images is a critical step for computer-aided diagnosis (CAD) systems for autoimmune diseases. Recognizing HEp-2 cells in the mitotic phase is clinically vital for validating HEp-2 samples and for diagnosing specimens with mixed patterns. Mitotic cells, however, are rarely observed in HEp-2 specimen images, so the available medical datasets are heavily skewed towards the majority interphase (non-mitotic) patterns.

Methods: This paper proposes a deep learning framework based on a self-attention deep cross-residual network combined with an efficient generative adversarial network (GAN). The framework consists of two consecutive steps. First, the imbalance between the minority mitotic and majority interphase cells is remedied using an Info-WGANGP approach in combination with conventional data augmentation methods. Second, a downstream end-to-end deep network, Att-DCRNet, is developed to classify the mitotic and interphase cell patterns in IIF HEp-2 medical images. A comprehensive experimental study validates the effectiveness of the proposed framework against other state-of-the-art methods on the public UQ-SNP_HEp-2 Task-3 medical dataset.

Results: The proposed framework demonstrates competitive classification results, attaining a maximum performance of 84.10% F1-score, 84.70% Matthews correlation coefficient (MCC), and 99.0% balanced class accuracy (BcA), which supports its applicability for automatically assisting an accurate diagnosis decision regarding HEp-2 mitotic and interphase cell patterns. The source code is available at: https://github.com/Anaam-dl/AttDCRNet_with_InfoWGANGP.
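For reference, the three reported metrics can all be derived from a binary confusion matrix. The following is a minimal pure-Python sketch of those formulas with illustrative counts only; it is not the paper's evaluation code, and the example numbers are not from its experiments.

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """Compute F1-score, Matthews correlation coefficient (MCC), and
    balanced class accuracy (BcA) from confusion-matrix counts, treating
    the rare mitotic class as the positive class."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)           # sensitivity (recall of positive class)
    specificity = tn / (tn + fp)      # recall of the negative class
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    bca = (recall + specificity) / 2  # mean of the two per-class recalls
    return f1, mcc, bca

# Illustrative counts for a skewed binary task (hypothetical values):
f1, mcc, bca = binary_metrics(tp=42, fp=8, fn=6, tn=944)
```

Note that on a heavily skewed dataset like this one, plain accuracy would be dominated by the interphase majority, which is why class-imbalance-aware metrics such as MCC and BcA are reported.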

