Abstract

With the advancing digitalization of almost all parts of our daily life, e.g., electronic health records and smart homes, and the outsourcing of data processing, e.g., data storage in the cloud and data analysis services, computer-based systems process more and more data these days. Often the processed data originate from natural persons (called data subjects) and are hence personal data that possibly contain sensitive information about these individuals. Privacy in the context of personal data processing means that personal data are protected, e.g., against unwanted access and modification, that data subjects are aware of the processing practices of the controller that processes their data, and that data subjects keep control over the processing of their personal data. Privacy regulations, such as the EU General Data Protection Regulation (GDPR), aim at protecting data subjects by empowering them with rights and by placing obligations on controllers that process personal data. Administrative fines defined in regulations are not the only driver for considering privacy in the development of a software-based system; several data breaches in recent years have shown that poor consideration of privacy during system and software development may ultimately lead to a loss of trust in the controller and damage to its reputation. To avoid data breaches and to comply with privacy regulations, privacy should be considered in system and software development as a software quality from the beginning. This approach is also known as privacy-by-design.

There are several challenges that existing privacy-by-design methods still do not fully address. First, diverse notions of privacy exist. Most of these privacy notions are non-technical and have to be refined into more technical privacy requirements that can be related to the system. Second, the system has to be analyzed for its personal data processing behavior, i.e., it has to be determined which personal data are collected, stored, and provided to others by the system. Third, the privacy requirements that are actually relevant for the system have to be elicited. Fourth, the privacy risks imposed by or existing in the system have to be identified and evaluated. Fifth, measures that implement the privacy requirements and mitigate the privacy risks of the system have to be selected and integrated into the system. Sixth, privacy regulations mandate assessing the impact of the personal data processing on the data subjects. Such a privacy impact assessment (PIA) may be performed as part of a privacy-by-design method. Seventh, the execution of a privacy-by-design method should be supported as well as possible, e.g., by a systematic method, supportive material, and computer support.

In this thesis, I propose the privacy requirements engineering method Problem-based Privacy Analysis (ProPAn). The ProPAn method aims to address the aforementioned challenges, starting with a system's functional requirements as input. As part of ProPAn, I provide a privacy requirements taxonomy that I derived from and mapped to various other privacy notions. This privacy requirements taxonomy addresses the first challenge mentioned above. The ProPAn method is the main contribution of my thesis and addresses the second through seventh challenges mentioned above.
To address the fifth challenge in the ProPAn method, I propose an aspect-oriented requirements engineering framework that makes it possible to model cross-cutting functionalities and to integrate them modularly into a system's functional requirements. The seventh challenge is addressed by ProPAn's computer support for the execution of the method and for the documentation and validation of the method's artifacts in a machine-readable model.
