Abstract
When humans observe other agents, a key expectation they hold is intentionality: that the agent is working toward some goal and is committed to achieving it. It is therefore desirable to develop agents that exhibit such behavior when they interact with humans, especially when communication between the agent and the human is involved. However, as Cohen and Levesque have noted, intention cannot be viewed in a vacuum, because it is tightly linked with an agent's beliefs about the world, and this effect is magnified when communication is involved. For my thesis, I am planning an agent framework that exhibits intentional behavior by modeling the agent's beliefs about the world and about other agents' beliefs, i.e., a theory of mind. My work focuses not only on having agents act with intentions, but also on how they can communicate these intentions to other agents, and even deceive other agents by communicating intentions that they do not actually hold. I will therefore focus on what I call epistemic games: (turn-based) games in which the acquisition and exchange of knowledge is an intrinsic part of gameplay.
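As a minimal illustration of the nested-belief modeling the abstract describes, the sketch below shows a hypothetical recursive belief structure: an agent holds its own facts plus models of what other agents believe, which is enough to express a simple deceptive intention (intending another agent to hold a belief the deceiver does not). The class, field names, and tuple-based facts are all assumptions for illustration, not the formalism of the proposed framework.

```python
from dataclasses import dataclass, field

@dataclass
class BeliefBase:
    """Illustrative nested belief model (hypothetical, not the thesis's formalism)."""
    facts: set = field(default_factory=set)      # facts this agent holds true
    models: dict = field(default_factory=dict)   # agent name -> BeliefBase (theory of mind)

    def believes(self, fact) -> bool:
        return fact in self.facts

    def model_of(self, other: str) -> "BeliefBase":
        # Lazily create a nested model of another agent's beliefs.
        return self.models.setdefault(other, BeliefBase())

# Agent A knows where the key is, but intends B to believe a false location.
a = BeliefBase()
a.facts.add(("key", "room1"))
a.model_of("B").facts.add(("key", "room2"))  # the belief A wants B to hold

print(a.believes(("key", "room1")))                  # A's true belief
print(a.model_of("B").believes(("key", "room2")))    # A's model of B's belief
```

Nesting can be carried further (A's model of B's model of A) to capture the recursive reasoning that epistemic games require.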
Published in: Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment