Abstract

Pursuant to the principle of 'privacy by design', the data controller must consider data protection aspects and integrate appropriate measures both before and during data processing in order to comply with the GDPR and protect the rights of data subjects. We presume, however, that this principle may collide with the increasingly popular psychological method of choice architecture. Taking advantage of the latter, more and more companies use nudges to steer users' behaviour towards choices that serve the companies' own interests. While such practices may be claimed legal, most of these companies knowingly rely on the indefinite wording of the regulation to justify their far-from-fair practices. This conflict between privacy by design and nudging draws attention to a regulatory gap that makes the violation of fundamental rights possible, since privacy and freedom of choice are likely to be disregarded. In this paper, we examine the compliance mechanisms currently prevailing on the market. We also seek to demonstrate how major companies can bend the definition of this fundamental principle to disguise the use of dark patterns and the abuse of privacy. Thereafter, we elaborate on the possible correlation between choice architecture and machine learning. In the last section, we examine which practices amount to an actual violation of fundamental rights, and we propose a possible advancement of privacy by design.