Abstract

This paper examines the relationship between AI technology and society in fundamental rights theory. In fundamental rights doctrine, the relationship between technology and society is seldom reflected upon: legal practitioners tend to view technology as a black box. For scholars of science and technology studies (STS), similarly, the law is a closed book. Such reductionist or compartmentalised thinking in law and the social sciences must be overcome if a conceptualisation of AI technology in fundamental rights theory is to succeed. The paper offers a perspective on these issues based on a re-interpretation of affordance theory (as originally framed in STS). First, the question “how do affordances come into a technology?” is answered from the viewpoint of Bryan Pfaffenberger’s “technological drama”: the affordances (the possibilities and constraints) of a technology are shaped in a dialogue between a “design constituency” and an “impact constituency”, in which the technology’s materiality and sociality are co-determined. Second, this theory is applied to the co-determination of AI technology. Finally, affordance theory is combined with socio-legal theorising that understands fundamental rights as social institutions bundling normative expectations about individual and social autonomies. How do normative expectations about the affordances of AI technology emerge, and how are they constitutionalised?
