Abstract
In response to fears around the risky and irresponsible development of artificial intelligence (AI), the prevailing approach from states, intergovernmental organisations, and technology firms has been to roll out a ‘new’ vocabulary of ethics. This self-regulatory approach relies on top-down, broadly stated ethics frameworks intended to moralise market dynamics and elicit socially responsible behaviour among top-end developers and users of AI software. At present, it remains an open question how well these principles are understood and internalised by AI practitioners throughout the AI ecosystem. The promotion of AI ethics has so far proceeded with little input from this group, despite their essential role in choosing and applying this emerging ethical language and associated tools in their project designs and related decision-making. As AI principles shift from normative organisational guides to operational practice, this paper offers a methodology, a ‘shared fairness’ approach, aimed at addressing this gap. The goal of this method is to identify AI practitioners’ needs when confronting and resolving ethical challenges and to find a ‘third space’ where their operational language can be married with that of the more abstract principles that presently remain at the periphery of their work life. We offer a grassroots approach to operational ethics based on dialogue and mutualised responsibility. This methodology is centred around conversations intended to elicit practitioners’ perceived ethical attribution and distribution over key value-laden operational decisions, to identify when these decisions arise and what ethical challenges they confront, and to engage in a language of ethics and responsibility that enables practitioners to internalise ethical responsibility. The methodology bridges responsibility imbalances that rest in structural decision-making power and elite technical knowledge by commencing with personal, facilitated conversations, returning the ethical discourse to those meant to give it meaning at the sharp end of the ecosystem. By attending to practitioners, our project aims to better understand ethics as a socio-technical practice, proceeding from the appreciation that, as a realistic force in regulation, ethics are dynamic and interdependent.