Abstract
An Electronic Health Record (EHR) is a digital record of patient visits containing various medical data, including diagnoses, treatments, and lab events. Representation learning on EHR data with deep learning methods has proven beneficial for patient-related prediction tasks. Recently, studies have focused on revealing the inherent graph structure among medical events in EHR data. Graph neural network (GNN) methods are prevalent and perform well on various prediction tasks. However, the inherent relationships between medical events must be manually annotated, which is complicated and time-consuming. Moreover, most studies apply straightforward GNN models to a single prediction task, which cannot fully exploit the potential of EHR representations. In contrast to previous work, multi-task prediction can utilize the latent information in concealed correlations between different prediction tasks. In addition, self-contrastive learning on graphs can improve the representations learned by GNNs. We propose a multi-gate mixture of multi-view graph contrastive learning (MMMGCL) method, aiming to obtain more reasonable EHR representations and improve performance on downstream tasks. First, each patient visit is represented as a graph with a well-designed, hierarchically fully-connected pattern. Second, node features in the manually constructed graph are pre-trained via the GloVe method with hierarchical ontology knowledge. Finally, MMMGCL processes the pre-trained graph and adopts a joint learning strategy to simultaneously optimize the task and contrastive losses. We verify our method on two large open-source medical datasets, the Medical Information Mart for Intensive Care (MIMIC-III) and the eICU Collaborative Research Database (eICU). Experimental results show that our method improves performance over straightforward graph-based methods on the prediction of patient readmission, mortality, and length of stay.
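The joint learning strategy mentioned above can be illustrated with a minimal sketch. The abstract does not specify the exact contrastive objective or weighting, so the NT-Xent loss over two graph views and the `lam` weight below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Contrastive (NT-Xent-style) loss over two views of the same graphs.

    z1, z2: (n, d) arrays of graph embeddings; row i of z1 and row i of z2
    are embeddings of two augmented views of the same patient-visit graph.
    NOTE: NT-Xent is a common choice in graph contrastive learning and is
    an assumption here, not necessarily the loss used in MMMGCL.
    """
    z = np.concatenate([z1, z2], axis=0)                 # (2n, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # L2-normalize
    sim = z @ z.T / temperature                          # cosine similarities
    np.fill_diagonal(sim, -np.inf)                       # exclude self-pairs
    n = z1.shape[0]
    # Positive pair of row i is row i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def joint_loss(task_loss, z1, z2, lam=0.1):
    """Joint objective: supervised task loss plus weighted contrastive loss.

    `lam` is a hypothetical trade-off hyperparameter for illustration.
    """
    return task_loss + lam * nt_xent_loss(z1, z2)
```

In practice the task loss would come from the supervised heads (readmission, mortality, length of stay), and both terms would be minimized together by one optimizer, which is the essence of the joint learning strategy the abstract describes.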