Abstract

In this article we seek to systematically explore and understand crucial aspects of a potential dark side of personalized transactions. Big data and artificial intelligence may enable businesses with access to the data and the required technology to effectively personalize their interactions with consumers in order to exploit informational asymmetries and/or consumer biases in novel ways and on an unprecedented scale. We identify three aspects of the dark side of personalized B2C transactions as particular areas of concern. First, businesses increasingly engage in first-degree price discrimination, siphoning rents from consumers. Second, firms systematically exploit well-known behavioral biases of consumers, such as their inability to correctly assess the long-term effects of complex transactions or their insufficient willpower. And third, businesses use microtargeted ads and recommendations to shape consumers’ preferences and steer them into a particular consumption pattern, effectively locking them into a lifestyle determined by their past choices and those of like-minded fellows. At first sight, siphoning rents, exploiting biases and shaping preferences appear to be relatively distinct phenomena. On closer inspection, however, they share a common underlying theme: the potential exploitation of consumers, or at least an impoverishment of their lives, by firms that apply novel and sophisticated technological means to maximize profits. Hence, the dark side of personalized B2C transactions may be characterized as consumers being “brought down by algorithms”: losing transaction surplus, engaging in welfare-reducing transactions and increasingly being trapped in a narrower life.
It is unclear whether first-degree price discrimination creates an efficiency problem, but it certainly raises concerns of distributive justice. We propose that it should be addressed by a clear and simple warning to the consumer that she is being offered a personalized price and, in addition, a right to indicate that she does not want to participate in a personalized pricing scheme. Similarly, behavioral biases may or may not lead consumers to conclude inefficient transactions. But consumers should be given an opportunity to reflect on choices that have been induced by firms applying exploitative algorithmic sales techniques. Hence, we propose that consumers should have a right to withdraw from a contract concluded under such conditions; indeed, in many jurisdictions they already have such a right today. Finally, shaping consumers’ preferences through microtargeted ads and recommendations prevents consumers from experimenting and leading a multifaceted life. Consumers should have a right to opt out of the technological steering mechanisms created and utilized by firms that impoverish their lives.
