Abstract

Recent years have brought both a notable rise in the ability to efficiently harvest vast amounts of information and a concurrent effort to preserve and actually enforce the privacy of patients and their data, as evidenced by the European GDPR. Under these conditions, the Distributed Learning Ecosystem has shown great potential in allowing researchers to pool the large amounts of sensitive data needed to develop and validate prediction models in a privacy-preserving way and with an eye towards personalized medicine. The aim of this abstract is to propose a privacy-preserving strategy for measuring the performance of the Cox Proportional Hazards (PH) model.
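
As background (the abstract itself does not spell out the model or the performance measure), the Cox PH model expresses the hazard for a patient with covariate vector x as

    h(t \mid x) = h_0(t) \exp(\beta^\top x),

and its discriminative performance is commonly summarized with Harrell's concordance index, the fraction of comparable patient pairs whose predicted risks \hat{\eta}_i = \hat{\beta}^\top x_i are ordered consistently with their observed survival times T_i (with event indicator \delta_i):

    C = \frac{\sum_{i,j} \delta_i \, \mathbf{1}\{T_i < T_j\} \, \mathbf{1}\{\hat{\eta}_i > \hat{\eta}_j\}}{\sum_{i,j} \delta_i \, \mathbf{1}\{T_i < T_j\}}.

These are standard formulations offered as a sketch of what "measuring performance" typically involves; which specific metric the proposed privacy-preserving strategy computes is not stated in the abstract.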
