Today's digital information systems and applications collect vast amounts of personal health information (PHI) every day, from sensor and surveillance systems and every time we use personal computers or mobile phones. The collected data is processed in clouds, platforms, and ecosystems by digital algorithms and machine learning. Pervasive technology, insufficient and ineffective privacy legislation, a strong ICT industry, and weak political will to protect data subjects' privacy have together made it almost impossible for users to know what PHI is collected, how it is used, and to whom it is disclosed. Service providers' and organizations' privacy policy documents are cumbersome, and they do not guarantee that PHI is not misused; instead, service users are expected to trust blindly in the privacy promises made. Despite governments' assurances that they meet their responsibility to protect citizens, the majority of individuals remain concerned about their privacy, and in real life privacy is effectively dead. Because PHI is probably the most sensitive data we have, and the authors argue it cannot be treated as a commodity or a public good, they have studied novel privacy approaches to find a way out of the current unsatisfactory situation. Based on their findings, the authors have developed a promising solution for privacy-enabled use of PHI: a combination of the concept of information fiduciary duty, the Privacy as Trust approach, and privacy by smart contract. This approach shifts the onus of privacy protection onto data collectors and service providers. A specific information fiduciary duty law is needed to harmonize privacy requirements and enforce the adoption of the proposed solution. Furthermore, the authors have studied the strengths and weaknesses of existing and emerging solutions.