Abstract

For a number of years, facial feature removal techniques such as ‘defacing’, ‘skull stripping’ and ‘face masking/blurring’ were considered adequate privacy-preserving tools for openly sharing brain images. Scientifically, these measures already represented a compromise between data protection requirements and the research utility of such data. Recent advances in machine learning and deep learning, which indicate an increased possibility of re-identification from defaced neuroimages, have heightened the tension between open science and data protection requirements. Researchers are left pondering how best to comply with the differing jurisdictional requirements of anonymization, pseudonymization or de-identification without compromising the scientific utility of neuroimages even further. In this paper, we present perspectives intended to clarify the meaning and scope of these concepts and highlight the privacy limitations of available pseudonymization and de-identification techniques. We also discuss possible technical and organizational measures and safeguards that can facilitate the sharing of pseudonymized neuroimages without further reducing the utility of the data.
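
To make the notion of pseudonymization discussed here concrete, the sketch below shows one common metadata-level approach for a DICOM file: direct identifiers are blanked or removed, and the patient ID is replaced with a keyed pseudonym whose key is held separately from the shared data. This is a minimal sketch using the pydicom library; the file names, the key handling and the exact set of tags scrubbed are illustrative assumptions only, and it does not touch the image content (e.g. facial features), which defacing, skull stripping and masking address.

    import hmac, hashlib
    import pydicom

    # Assumption: the key is stored by the data controller, apart from the shared files
    SECRET_KEY = b"held-separately-by-the-data-controller"

    def pseudonymize(in_path: str, out_path: str) -> None:
        """Replace direct identifiers in a DICOM header with a keyed pseudonym (sketch only)."""
        ds = pydicom.dcmread(in_path)

        # Keyed pseudonym: same subject -> same code, not reversible without the key
        pseudonym = hmac.new(SECRET_KEY, str(ds.PatientID).encode(), hashlib.sha256).hexdigest()[:16]
        ds.PatientID = pseudonym
        ds.PatientName = pseudonym

        # Blank other direct identifiers if present (non-exhaustive, for illustration)
        for tag in ("PatientBirthDate", "PatientAddress", "OtherPatientIDs", "InstitutionName"):
            if tag in ds:
                setattr(ds, tag, "")
        ds.remove_private_tags()  # vendor-specific tags often carry identifying details

        ds.save_as(out_path)

    # pseudonymize("subject01_raw.dcm", "subject01_pseudo.dcm")  # hypothetical file names

The key point of the design is that the pseudonym-to-identity mapping (the HMAC key) is the “additional information” that must be kept separately under technical and organizational safeguards; without it, the shared file carries only indirect identifiers.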

Highlights

  • Advances in imaging technology have led to significant changes in the nature, size and variety of neuroimages collected, processed, stored and shared

  • This paper contributes to both research and practice: it improves the legal and technical understanding of anonymization, pseudonymization and de-identification of neuroimages, and identifies legal positions under which neuroimages can be shared without overemphasizing the total removal of direct and indirect identifiers, thereby preserving scientific utility

  • A de-anonymization attack by Ravindra and Grama (2019), which relied on novel analysis techniques, demonstrated the theoretical possibility of re-identifying individual subjects and the tasks they were performing during the scan, and potentially of linking other corresponding patient data such as disease progression, behavioral traits, sex, gender and contact details (a generic sketch of connectome-based matching follows this list)
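
The sketch below is not the method of Ravindra and Grama (2019); it only illustrates the generic “connectome fingerprinting” idea underlying such re-identification attacks: a functional connectivity matrix from one scan is matched against a database of connectivity matrices from another session by simple correlation, and the best-matching entry links the two records. The subject counts, matrix sizes and random data are assumptions for illustration.

    import numpy as np

    def upper_triangle(conn: np.ndarray) -> np.ndarray:
        """Vectorize a symmetric connectivity matrix (regions x regions) into its upper triangle."""
        iu = np.triu_indices_from(conn, k=1)
        return conn[iu]

    def fingerprint_match(query: np.ndarray, database: list[np.ndarray]) -> int:
        """Return the index of the database connectome most correlated with the query."""
        q = upper_triangle(query)
        scores = [np.corrcoef(q, upper_triangle(c))[0, 1] for c in database]
        return int(np.argmax(scores))

    # Illustrative use with random data (n_subjects and n_regions are arbitrary assumptions)
    rng = np.random.default_rng(0)
    n_subjects, n_regions = 5, 50
    session1 = [rng.standard_normal((n_regions, n_regions)) for _ in range(n_subjects)]
    session1 = [(m + m.T) / 2 for m in session1]                            # make symmetric
    session2 = [m + 0.1 * rng.standard_normal(m.shape) for m in session1]   # noisy "second session"
    print(fingerprint_match(session2[3], session1))  # ideally prints 3: subject re-identified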

Summary

Introduction

Advances in imaging technology have led to significant changes in the nature, size and variety of neuroimages collected, processed, stored and shared. This paper contributes to both research and practice: it improves the legal and technical understanding of anonymization, pseudonymization and de-identification of neuroimages, and identifies legal positions under which neuroimages can be shared without overemphasizing the total removal of direct and indirect identifiers, thereby preserving scientific utility. These insights are of interest to individual researchers, funding agencies, platform providers, local, national and international policy makers, and members of institutional review boards.

Neuroimages
Uniqueness of neuroimages
Pseudonymization of neuroimages: scientific utility and open sharing
Impact on scientific utility of neuroimages
Privacy concerns
Open sharing predicated on ‘anonymization’
Responsible sharing of neuroimages
Pseudonymization
Encryption
Proposed computational solutions
Conclusion
Findings
Declarations of interest