Abstract

When humans observe other agents, one key property they expect of an agent's behavior is intentionality, i.e. that the agent is working towards some goal and is committed to achieving it. It is therefore desirable to develop agents that exhibit such behavior when they are meant to interact with humans, especially when communication between the agent and the human is involved. However, as Cohen and Levesque have noted, intention cannot be viewed in a vacuum, because it is tightly linked with an agent's beliefs about the world, and this effect is magnified when communication is involved. For my thesis, I am planning an agent framework that exhibits intentional behavior by modeling the agent's beliefs about the world and about other agents' beliefs, i.e. a theory of mind. My work focuses not only on having agents act with intentions, but also on how they can communicate these intentions to other agents, and even deceive other agents by communicating intentions that they do not actually hold. I will therefore focus on what I call epistemic games: (turn-based) games in which the acquisition and exchange of knowledge is an intrinsic part of game play.
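The idea of combining a theory of mind with (possibly deceptive) communication of intentions can be illustrated with a toy sketch. The following Python snippet is a minimal, hypothetical model (all class and attribute names are assumptions, not part of the proposed framework): each agent holds a set of propositions it believes, plus nested models of what it believes other agents believe, and announcing an intention updates the listeners' beliefs regardless of whether the speaker actually holds it.

```python
from dataclasses import dataclass, field

# Toy theory-of-mind sketch (hypothetical names throughout):
# an Agent holds its own beliefs and models of other agents' beliefs.

@dataclass
class Agent:
    name: str
    beliefs: set = field(default_factory=set)   # propositions this agent holds
    models: dict = field(default_factory=dict)  # other agent's name -> Agent model

    def believes(self, prop: str) -> bool:
        return prop in self.beliefs

    def believes_that(self, other: str, prop: str) -> bool:
        # What this agent thinks `other` believes (first-order theory of mind).
        model = self.models.get(other)
        return model is not None and model.believes(prop)

    def announce(self, prop: str, audience: list) -> None:
        # Communicating an intention: each listener adopts the proposition
        # and attributes it to the speaker. A deceptive announcement is
        # simply one where `prop` is absent from the speaker's own beliefs.
        for listener in audience:
            listener.beliefs.add(prop)
            listener.models.setdefault(self.name, Agent(self.name)).beliefs.add(prop)

alice = Agent("alice", beliefs={"goal: capture flag"})
bob = Agent("bob")

# Alice deceives: she announces an intention she does not hold.
alice.announce("goal: defend base", [bob])
print(bob.believes_that("alice", "goal: defend base"))  # True
print(alice.believes("goal: defend base"))              # False
```

In a turn-based epistemic game, such announcements would be moves in their own right, since they change the epistemic state of the other players.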
