Abstract

The creation of artificial moral systems requires difficult choices about which of the many competing human value sets should be instantiated. The industry-standard approach is to seek and encode moral consensus. Here the authors argue, drawing on evidence from empirical psychology, that encoding current moral consensus risks reinforcing current norms and thus inhibiting moral progress. Yet efforts to encode progressive norms face a parallel difficulty, leaving machine ethics caught between a rock and a hard place. The problem is especially acute when progress beyond prevailing moral norms is urgent, as is currently the case given the inadequacy of those norms in the face of the climate and ecological crisis.
