Recently, deep learning-based models have been widely employed for electrocardiogram (ECG) classification. However, classifying long-term ECGs, which contain vast amounts of data, remains challenging. Because memory is limited relative to the size of the raw recordings, preprocessing techniques such as resizing or cropping are often applied, leading to information loss. Introducing multi-instance learning (MIL) is therefore a promising way to address long-term ECG classification. A major drawback of employing MIL, however, is that splitting a recording destroys sample integrity, which hinders interaction among instances. To tackle this challenge, we propose a multimodal MIL neural network, CIMIL, which consists of three key components: an instance interactor, a feature fusion method based on attention mechanisms, and a multimodal contrastive instance loss. First, we design an instance interactor to improve interaction and preserve continuity among instances. Second, we propose a novel attention-based feature fusion method that effectively aggregates multimodal instance features for final classification by selecting key instances within each class, which not only enhances the performance of our model but also reduces the number of parameters. Third, we propose a multimodal contrastive instance loss to enhance the model's ability to distinguish positive and negative multimodal instances. Finally, we evaluate CIMIL under both intra-patient and inter-patient paradigms on two commonly used ECG datasets. The experimental results show that the proposed CIMIL outperforms existing state-of-the-art methods on long-term ECG tasks.
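The abstract does not give the exact formulation of the attention-based aggregation, so the following is only a minimal illustrative sketch of generic attention-based MIL pooling over instance features (in the style of gated-attention MIL), not CIMIL's actual fusion method; all dimensions, parameter names, and the random initialisation are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical setup: a long-term ECG split into 8 instances (segments),
# each already encoded into a 16-dimensional feature vector.
n_instances, d = 8, 16
H = rng.standard_normal((n_instances, d))  # instance feature matrix

# Attention parameters (randomly initialised here for illustration;
# in a real model these are learned jointly with the encoder).
V = rng.standard_normal((d, d))  # projection matrix
w = rng.standard_normal(d)       # scoring vector

# Per-instance attention scores: s_k = w^T tanh(V h_k),
# normalised with softmax so the weights sum to 1.
scores = np.tanh(H @ V.T) @ w
a = softmax(scores)

# Bag-level representation: attention-weighted sum of instance features,
# which a classifier head would then consume.
z = a @ H
```

Instances with high attention weight act as the "key instances" that dominate the bag representation, which is how attention pooling can both guide classification and expose which segments of the recording mattered.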