Abstract

Smart devices that operate in a shared environment with people need to be aligned with their values and requirements. We study the problem of multiple stakeholders informing the same device about what the right thing to do is. Specifically, we focus on how to reach a middle ground among the stakeholders' inevitably incoherent judgments on what the rules of conduct for the device should be. We formally define a notion of a middle ground and discuss its main properties. We then identify three sufficient conditions on the class of Horn expressions under which middle grounds are guaranteed to exist, and we provide a polynomial-time algorithm that computes middle grounds under these conditions. We also show that if any of the three conditions is removed, then middle grounds for the resulting (larger) class may not exist. Finally, we implement our algorithm and perform experiments using data from the Moral Machine Experiment. We present conflicting rules for different countries and show how the algorithm finds a middle ground in this case.
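
To make the setting concrete, the sketch below shows one hypothetical way stakeholders' Horn-style rules of conduct could be represented and a naive agreement-based consensus computed over them. The rule encoding, the stakeholder names, and the consensus criterion are illustrative assumptions only; they do not reproduce the paper's formal definition of a middle ground or its polynomial-time algorithm.

```python
# Toy illustration (assumed encoding, not the paper's method): stakeholders
# contribute Horn rules, and we keep only rules every stakeholder agrees with.

from itertools import chain

# A Horn rule is a pair (body, head): if every atom in `body` holds, `head` holds.

def closure(facts: set, rules: list) -> set:
    """Forward-chain a set of Horn rules to a fixpoint."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

def naive_common_ground(stakeholders: dict) -> list:
    """Keep each distinct contributed rule whose head is derivable from its
    body under every stakeholder's rule set (a naive consensus criterion)."""
    candidates = dict.fromkeys(chain.from_iterable(stakeholders.values()))
    return [
        (body, head)
        for body, head in candidates
        if all(head in closure(set(body), rules) for rules in stakeholders.values())
    ]

# Two hypothetical stakeholders with partially conflicting driving rules.
stakeholders = {
    "country_A": [(frozenset({"pedestrian_ahead"}), "brake"),
                  (frozenset({"pedestrian_ahead", "passenger_at_risk"}), "swerve")],
    "country_B": [(frozenset({"pedestrian_ahead"}), "brake")],
}

print(naive_common_ground(stakeholders))  # only the rule both stakeholders accept survives
```

In this toy run, the rule "if a pedestrian is ahead, brake" is accepted by both stakeholders and is kept, while the swerving rule contributed only by country_A is dropped; the paper's middle-ground notion is more refined than this simple intersection-style criterion.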
