Abstract

Contemporary information systems make widespread use of artificial intelligence (AI). While AI offers various benefits, it can also be subject to systematic errors, whereby people from certain groups (defined by gender, age, or other sensitive attributes) experience disparate outcomes. In many AI applications, disparate outcomes confront businesses and organizations with legal and reputational risks. To address these risks, technologies for so-called “AI fairness” have been developed, by which AI is adapted such that mathematical constraints for fairness are fulfilled. However, the financial costs of AI fairness are unclear. Therefore, the authors develop AI fairness for a real-world use case from e-commerce, where coupons are allocated according to clickstream sessions. In their setting, the authors find that AI fairness successfully adheres to the fairness requirements while reducing overall prediction performance only slightly. However, they also find that AI fairness results in increased financial costs. The paper’s findings thus contribute to designing information systems on the basis of AI fairness.

Highlights

  • Contemporary information systems make widespread use of artificial intelligence (AI)

  • We study the financial costs of AI fairness in a real-world e-commerce use case

  • AI might introduce disparate outcomes for users depending on certain sociodemographics, such as gender


Introduction

Contemporary information systems make widespread use of artificial intelligence (AI). AI can lead to disparate outcomes for people according to certain sociodemographics (gender, race, or other attributes deemed sensitive). In this case, AI may lead to discrimination (Barocas and Selbst 2016). In e-commerce, AI is utilized to personalize website interactions, yet it has been found that AI systems show significantly fewer advertisements for high-paying jobs to women than to men (Datta et al. 2015; Lambrecht and Tucker 2019). This could limit women’s access to resources or hinder their economic advancement. We use the following notation: we refer to the predicted label as Ŷ, the actual label as Y, and the sensitive attribute as A.
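To illustrate the notation, a common fairness criterion over Ŷ and A is demographic (statistical) parity: positive predictions should occur at equal rates across the groups defined by A. The following is a minimal sketch of how such a disparity could be measured; the function name and example data are illustrative assumptions, not the paper’s implementation.

```python
import numpy as np

def demographic_parity_difference(y_hat, a):
    """Gap between the highest and lowest rate of positive
    predictions (Y_hat = 1) across groups of the sensitive
    attribute A. A value of 0 means perfect demographic parity."""
    y_hat = np.asarray(y_hat)
    a = np.asarray(a)
    rates = [y_hat[a == g].mean() for g in np.unique(a)]
    return max(rates) - min(rates)

# Hypothetical example: coupon allocations (Y_hat) for two groups (A).
y_hat = [1, 1, 0, 1, 0, 0, 0, 1]
a     = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_difference(y_hat, a))  # 0.75 - 0.25 = 0.5
```

In this sketch, group 0 receives coupons in 3 of 4 sessions while group 1 receives them in 1 of 4, yielding a disparity of 0.5; a fairness-constrained model would be adapted to drive this gap toward zero.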

