Abstract

What is required to allow an artificial agent to engage in rich, human-like interactions with people? I argue that this will require capturing the process by which humans continually create and renegotiate 'bargains' with each other. These hidden negotiations will concern topics including who should do what in a particular interaction, which actions are allowed and which are forbidden, and the momentary conventions governing communication, including language. Such bargains are far too numerous, and social interactions too rapid, for negotiation to be conducted explicitly. Moreover, the very process of communication presupposes innumerable momentary agreements concerning the meaning of communicative signals, thus raising the threat of circularity. Hence, the improvised 'social contracts' that govern our interactions must be implicit. I draw on the recent theory of virtual bargaining, according to which social partners mentally simulate a process of negotiation, to outline how these implicit agreements can be made, and note that this viewpoint raises substantial theoretical and computational challenges. Nonetheless, I suggest that these challenges must be met if we are ever to create AI systems that can work collaboratively alongside people, rather than serving primarily as valuable special-purpose computational tools. This article is part of a discussion meeting issue 'Cognitive artificial intelligence'.
