In medical image analysis, where challenges such as high class imbalance, inter-class similarity, and intra-class variance are prevalent, knowledge distillation has emerged as a powerful mechanism for model compression and regularization. Existing methodologies, including label smoothing, contrastive learning, and relational knowledge transfer, aim to address these challenges but fall short in handling either class imbalance or the intricate inter- and intra-class relations among input samples. In response, this paper introduces StAlK (Structural Alignment based Self Knowledge distillation) for Medical Image Classification, a novel approach that aligns complex high-order discriminative features from a mean teacher model. This alignment enhances the student model's ability to distinguish examples across different classes. StAlK performs strongly in scenarios involving both inter- and intra-class relationships and proves significantly more robust to class imbalance than baseline methods. Extensive experiments on multiple benchmark datasets show that StAlK improves top-1 accuracy by 6%–7% over various state-of-the-art baselines. The code is available at: https://github.com/philsaurabh/StAlK_KBS.
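(For readers unfamiliar with the mean teacher construction referenced above, the sketch below shows the standard update: the teacher's weights are an exponential moving average of the student's. This is a minimal illustration of the general technique, not StAlK's implementation; the function name `ema_update` and the momentum value are illustrative assumptions.)

```python
import torch


def ema_update(student: torch.nn.Module, teacher: torch.nn.Module,
               momentum: float = 0.999) -> None:
    """Update teacher weights as an exponential moving average of the student's.

    The teacher is never trained by gradient descent; after each student
    optimization step, each teacher parameter is blended toward the
    corresponding student parameter with the given momentum.
    """
    with torch.no_grad():
        for p_t, p_s in zip(teacher.parameters(), student.parameters()):
            p_t.mul_(momentum).add_(p_s, alpha=1.0 - momentum)
```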