Abstract
The principles of transparency and explainability are landmarks of the current EU approach to artificial intelligence. Both are invoked in policy guidelines as values governing algorithmic decision-making, and both provide rationales for existing normative provisions on information duties, access rights, and control powers. This contribution addresses the debate on transparency and explainability from the perspective of EU consumer markets. It considers the position of consumers relative to algorithmic decision-making and discusses the risks they face of mass surveillance, exploitation, and manipulation. The concept of algorithmic opacity is analyzed, distinguishing technology-based opacity, which is intrinsic to design choices, from relational opacity toward users. The response of EU law is then considered. The emerging approach to algorithmic transparency and explainability is connected to broader regulatory goals concerning transparency in consumer markets. It is argued that EU law focuses on adequate information being provided to lay consumers (exoteric transparency) rather than on understandability to experts (esoteric transparency). A discussion follows on the benefits of transparency, its costs, and the extent to which transparency can be implemented without affecting performance. Finally, the merits of a transparency-based regulation of algorithms are assessed, and insights are offered on regulating transparency and explainability within the EU law paradigm.