What is required to allow an artificial agent to engage in rich, human-like interactions with people? I argue that this will require capturing the process by which humans continually create and renegotiate 'bargains' with each other. These hidden negotiations will concern topics including who should do what in a particular interaction, which actions are allowed and which are forbidden, and the momentary conventions governing communication, including language. Such bargains are far too numerous, and social interactions too rapid, for negotiation to be conducted explicitly. Moreover, the very process of communication presupposes innumerable momentary agreements concerning the meaning of communicative signals, thus raising the threat of circularity. The improvised 'social contracts' that govern our interactions must therefore be implicit. I draw on the recent theory of virtual bargaining, according to which social partners mentally simulate a process of negotiation, to outline how these implicit agreements can be made, and note that this viewpoint raises substantial theoretical and computational challenges. Nonetheless, I suggest that these challenges must be met if we are ever to create AI systems that can work collaboratively alongside people, rather than serving primarily as valuable special-purpose computational tools. This article is part of a discussion meeting issue 'Cognitive artificial intelligence'.