Abstract

With the growing use of multimodal data for deep learning classification in healthcare research, more studies are presenting explainability methods for insight into multimodal classifiers. Among these studies, few utilize local explainability methods, which can provide (1) insight into the classification of samples over time and (2) better understanding of the effects of demographic and clinical variables upon patterns learned by classifiers. To the best of our knowledge, we present the first local explainability approach for insight into the importance of each modality to the classification of samples over time. Our approach uses ablation, and we demonstrate how it can show the importance of each modality to the correct classification of each class. We further present a novel analysis that explores the effects of demographic and clinical variables upon the multimodal patterns learned by the classifier. As a use-case, we train a convolutional neural network for automated sleep staging with electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) data. We find that EEG is the most important modality across most stages, though EOG is particularly important for non-rapid eye movement stage 1. Further, we identify significant relationships between the local explanations and subject age, sex, and medication status, which suggest that the classifier learned features associated with these variables across multiple modalities and correctly classified samples. Our novel explainability approach has implications for many fields involving multimodal classification. Moreover, our examination of the degree to which demographic and clinical variables may affect classifiers could provide direction for future studies in automated biomarker discovery.
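
The ablation-based local explanation described in the abstract can be sketched roughly as follows: for a single sample, each modality is perturbed (here, zeroed out) in turn and the resulting drop in the predicted-class probability is taken as that modality's local importance. This is a minimal illustration only; the toy model, input shapes, zero-ablation baseline, and function names below are assumptions for demonstration and not the authors' exact implementation.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a trained multimodal sleep-staging CNN
# (EEG/EOG/EMG stacked as input channels); architecture is illustrative only.
class ToySleepCNN(nn.Module):
    def __init__(self, n_modalities=3, n_classes=5):
        super().__init__()
        self.conv = nn.Conv1d(n_modalities, 8, kernel_size=7, padding=3)
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.fc = nn.Linear(8, n_classes)

    def forward(self, x):  # x: (batch, modalities, time)
        h = torch.relu(self.conv(x))
        return self.fc(self.pool(h).squeeze(-1))  # class logits


def modality_ablation_importance(model, x, modality_names):
    """Local explanation for one sample: drop in the predicted-class
    probability when each modality is zeroed out in turn."""
    model.eval()
    with torch.no_grad():
        baseline = torch.softmax(model(x), dim=1)
        pred = int(baseline.argmax(dim=1))
        base_p = baseline[0, pred].item()

        importances = {}
        for i, name in enumerate(modality_names):
            x_ablated = x.clone()
            x_ablated[:, i, :] = 0.0  # ablate one modality
            p = torch.softmax(model(x_ablated), dim=1)[0, pred].item()
            importances[name] = base_p - p  # larger drop = more important
    return pred, importances


if __name__ == "__main__":
    model = ToySleepCNN()
    epoch = torch.randn(1, 3, 3000)  # one synthetic 30-s EEG/EOG/EMG epoch
    stage, imp = modality_ablation_importance(model, epoch, ["EEG", "EOG", "EMG"])
    print(f"predicted stage {stage}: {imp}")
```

Applied across all samples of a recording, per-sample importances of this kind can then be grouped by sleep stage or related to demographic and clinical variables, as in the analysis the abstract describes.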
