Data is an important business asset that can be valued and shared for different business purposes. Data about users of services, social platforms, marketplaces, and smart device owners is used for a variety of purposes: analytics and service improvement, hypothesis testing and new product development, partnerships, profiling and personalized advertising, machine learning, and the development of new technologies such as the IoT. Data is used not only in the usual context of the IT giants, but also in economic sectors such as agriculture and industry.1 The amount of data being produced is growing exponentially.2 The commercial turnover of information is expected to grow to an estimated $549 billion by 2028 for the Big Data market alone.3 At the same time, the use of data concerning a natural person falls under the human rights regime aimed at protecting the fundamental right to privacy. That is why data protection regulations have historically attempted to reconcile the competing values of the free flow of data and the individual’s privacy.4 Nowadays, the pervasive use of large volumes of information which include personal data makes this an even harder endeavour and creates new challenges for data protection legislation, since it is necessary to foster innovation based on the data economy while at the same time facing increasing threats to the privacy of individuals. In theory, de-identification,5 which dilutes the link between data and the individual, can successfully reconcile these goals. For this reason, the processing of depersonalized data justly deserves more freedom than the processing of personal data. Depersonalization has become a widespread practice and is in many cases subject to regulation. However, there is an inverse relationship between the degree of data depersonalization and its value to business: the more accurate and voluminous the data on people is, the more patterns can be drawn from it.