Abstract

Accurate diagnosis of pneumonia and recognition of anteroposterior (AP) and posteroanterior (PA) views in pediatric chest X-rays (CXR) are critical components of contemporary computer-aided medical imaging systems. However, a prevalent issue in current methodologies is suboptimal representation caused by class imbalance, which is particularly pronounced in medical imaging, where certain classes have very few samples. Furthermore, despite their promising performance, deep learning models often struggle with the varied manifestations of pediatric pneumonia and with lung diseases other than pneumonia, so improving model generalization is imperative. To address these limitations, this paper presents a novel approach, termed SEACC, that combines weight-adaptive contrastive learning with self-knowledge distillation. First, we employ self-knowledge distillation, leveraging sample-level soft targets to enhance the model's generalization without incurring additional computational overhead. Second, we incorporate supervised contrastive learning, introducing contrast weights among positive samples based on feature similarity; this encourages feature representations with intra-class compactness and inter-class dispersion. Experimental evaluations on the public Guangzhou Women and Children's Medical Center dataset show improvements of 3.3% in Recall and 10.1% in AUC, surpassing existing SOTA methods.
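
The abstract does not give implementation details, so the following PyTorch-style sketch only illustrates the two ingredients it names: a supervised contrastive loss whose positive-pair terms are weighted by feature similarity, and a self-distillation loss on sample-level soft targets. The function names, the exact weighting scheme (detached cosine similarity re-normalized per anchor), and the choice of teacher logits are assumptions for illustration, not the authors' SEACC formulation.

```python
import torch
import torch.nn.functional as F


def weighted_supcon_loss(features: torch.Tensor, labels: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss with similarity-based weights on positive pairs.

    features: (N, D) L2-normalized embeddings; labels: (N,) integer class ids.
    The weighting scheme (cosine similarity of positives, re-normalized per
    anchor) is an assumption for illustration, not the SEACC formulation.
    """
    n = features.size(0)
    device = features.device
    sim = features @ features.t() / temperature                   # scaled pairwise similarities
    self_mask = torch.eye(n, dtype=torch.bool, device=device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # contrastive log-probabilities; the denominator excludes the anchor itself
    exp_sim = torch.exp(sim) * (~self_mask)
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)

    # weight each positive pair by its (detached) feature similarity,
    # normalized so an anchor's positive weights sum to 1
    raw_w = (features @ features.t()).detach().clamp(min=0.0) * pos_mask
    weights = raw_w / raw_w.sum(dim=1, keepdim=True).clamp(min=1e-12)

    loss_per_anchor = -(weights * log_prob).sum(dim=1)
    valid = pos_mask.any(dim=1)                                   # anchors with >= 1 positive
    return loss_per_anchor[valid].mean()


def self_distillation_loss(student_logits: torch.Tensor,
                           teacher_logits: torch.Tensor,
                           T: float = 4.0) -> torch.Tensor:
    """KL divergence between temperature-softened outputs. The 'teacher' logits
    are assumed to come from the same network (e.g. an earlier snapshot or an
    auxiliary head), so no separate teacher model is trained."""
    soft_targets = F.softmax(teacher_logits.detach() / T, dim=1)  # sample-level soft targets
    log_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
```

In such a setup the two losses would typically be added to the standard cross-entropy objective with scalar weights; the similarity weighting pulls harder on positives that already lie close in feature space, which is one plausible way to obtain intra-class compactness without extra supervision.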

