Abstract

Patient similarity learning has attracted great research interest in biomedical informatics. Correctly identifying the similarity between a given patient and patient records in the database could provide clinical reference for diagnosis and medication. The sparsity of underlying relationships between patients poses difficulties for similarity learning, which becomes more challenging when considering real-world Electronic Health Records (EHRs) with a large number of missing values. In this paper, we organize EHRs as a graph and propose a novel deep learning framework, Structure-aware Siamese Graph neural Networks (SSGNet), to perform robust encounter-level patient similarity learning while capturing the intrinsic graph structure and mitigating the influence of missing values. The proposed SSGNet regards each patient encounter as a node, and learns the node embeddings and the similarity between nodes simultaneously via Graph Neural Networks (GNNs) with a Siamese architecture. Further, SSGNet employs a low-rank and contrastive objective to optimize the structure of the patient graph and enhance model capacity. Extensive experiments were conducted on two publicly available datasets and a real-world dataset regarding IgA nephropathy from Peking University First Hospital, in comparison with multiple baseline and state-of-the-art methods. The significant improvement in Accuracy, Precision, Recall and F1 score on the patient encounter pairwise similarity classification task demonstrates the superiority of SSGNet. The mean average precision (mAP) of SSGNet on the similar encounter retrieval task also surpasses that of the other competitors. Furthermore, SSGNet's stable similarity classification accuracy at different missing rates of data validates the effectiveness and robustness of our proposal.
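To make the Siamese-GNN idea in the abstract concrete, the following is a minimal illustrative sketch, not the paper's actual SSGNet: each encounter is a graph node, a single shared-weight graph-convolution layer produces node embeddings, and the similarity of an encounter pair is the cosine similarity of their embeddings. All function and variable names (`gcn_layer`, `siamese_similarity`, the toy graph) are assumptions for illustration; the paper's learned graph structure, low-rank objective, and contrastive objective are not shown here.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution layer: symmetrically normalized adjacency
    (with self-loops) times node features times a weight matrix, ReLU."""
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def siamese_similarity(A, X, W, i, j):
    """Siamese setup: both encounters i and j are encoded by the SAME
    shared-weight GCN; similarity is the cosine of their embeddings."""
    Z = gcn_layer(A, X, W)                       # embeddings for all nodes
    zi, zj = Z[i], Z[j]
    denom = np.linalg.norm(zi) * np.linalg.norm(zj) + 1e-8
    return float(zi @ zj / denom)

# Toy patient-encounter graph: 4 encounters on a path, 3 features each.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))                      # encounter feature vectors
W = rng.normal(size=(3, 2))                      # shared encoder weights
sim = siamese_similarity(A, X, W, 0, 1)
```

In practice the shared weights would be trained with a pairwise (e.g. contrastive) loss so that clinically similar encounter pairs score high and dissimilar pairs score low; the single hand-set layer here only illustrates the weight-sharing structure.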
