Abstract

The electrocardiogram (ECG) is a universal diagnostic tool for heart disease and a rich source of data for deep learning. However, the scarcity of labeled data is a major challenge for medical artificial intelligence diagnosis, because labeling medical data is time-consuming and costly and requires medical specialists. As a generative self-supervised learning method, the masked autoencoder (MAE) is well suited to addressing this problem. This article proposes an MAE family for ECG (MaeFE). Considering the temporal and spatial features of ECG, MaeFE contains three customized masking modes: the masked time autoencoder (MTAE), the masked lead autoencoder (MLAE), and the masked lead and time autoencoder (MLTAE). MTAE and MLAE emphasize temporal and spatial features, respectively, while MLTAE is a multihead architecture that combines the two. In the pretraining stage, ECG signals from the pretrain dataset are divided into patches and partially masked; the encoder maps the unmasked patches to tokens, and the decoder reconstructs the masked ones. In the downstream task, arrhythmia classification on the downstream dataset, the pretrained encoder is reused as the classifier; this process is known as transfer learning. MaeFE outperforms state-of-the-art self-supervised learning methods, including SimCLR, MoCo, CLOCS, and MaskUNet, in downstream tasks, and MTAE shows the best overall performance. Compared to contrastive learning models, MTAE achieves at least a 5.18%, 11.80%, and 3.23% increase in accuracy (Acc), Macro-F1, and area under the curve (AUC), respectively, using the linear probe. It also outperforms the other models by 8.99% in Acc, 20.18% in Macro-F1, and 7.13% in AUC using fine-tuning. Experiments on multilabel arrhythmia classification, conducted as another downstream task, further demonstrate the generalization ability of MaeFE. The experimental results show that MaeFE is efficient and robust in downstream tasks, overcomes the scarcity of labeled data, and surpasses other self-supervised learning methods. Consequently, the proposed algorithm is well positioned for practical applications.
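
To make the pretraining pipeline described above concrete, the following is a minimal PyTorch sketch of an MTAE-style masking mode: a 12-lead signal is split into time patches, a random subset is masked, the encoder sees only the visible patches, and a lightweight decoder reconstructs the masked ones. The module sizes, masking ratio, and omission of positional embeddings are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of masked-autoencoder pretraining on ECG (MTAE-style time masking).
# All hyperparameters are illustrative; positional embeddings are omitted for brevity.
import torch
import torch.nn as nn


class ECGMaskedAutoencoder(nn.Module):
    def __init__(self, n_leads=12, patch_len=50, dim=128, mask_ratio=0.75):
        super().__init__()
        self.patch_len = patch_len
        self.mask_ratio = mask_ratio
        # Each patch concatenates all leads over one time window.
        self.embed = nn.Linear(n_leads * patch_len, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True), num_layers=4)
        self.decoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True), num_layers=2)
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.reconstruct = nn.Linear(dim, n_leads * patch_len)

    def patchify(self, x):
        # x: (batch, n_leads, time) -> (batch, n_patches, n_leads * patch_len)
        b, l, t = x.shape
        n = t // self.patch_len
        x = x[:, :, : n * self.patch_len].reshape(b, l, n, self.patch_len)
        return x.permute(0, 2, 1, 3).reshape(b, n, l * self.patch_len)

    def forward(self, x):
        patches = self.patchify(x)
        b, n, _ = patches.shape
        n_keep = int(n * (1 - self.mask_ratio))
        # Randomly choose which time patches remain visible to the encoder.
        perm = torch.rand(b, n, device=x.device).argsort(dim=1)
        keep_idx = perm[:, :n_keep]
        visible = torch.gather(
            self.embed(patches), 1,
            keep_idx.unsqueeze(-1).expand(-1, -1, self.embed.out_features))
        tokens = self.encoder(visible)  # tokens of unmasked patches only
        # Re-insert mask tokens at the masked positions, then decode and reconstruct.
        full = self.mask_token.expand(b, n, -1).clone()
        full.scatter_(1, keep_idx.unsqueeze(-1).expand(-1, -1, tokens.size(-1)), tokens)
        recon = self.reconstruct(self.decoder(full))
        # Reconstruction loss is computed on the masked patches only.
        mask = torch.ones(b, n, device=x.device)
        mask.scatter_(1, keep_idx, 0.0)
        loss = ((recon - patches) ** 2).mean(dim=-1)
        return (loss * mask).sum() / mask.sum()


# Example pretraining step on a random 10 s, 500 Hz, 12-lead batch.
model = ECGMaskedAutoencoder()
loss = model(torch.randn(8, 12, 5000))
loss.backward()
```

In the downstream stage, the pretrained encoder would be kept and the decoder discarded, with a small classification head trained on top (linear probing) or the whole encoder fine-tuned, matching the transfer learning setup summarized in the abstract.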
