Abstract
In this paper, we propose a trust model to be used in a hypothetical mixed environment where humans and unmanned vehicles cooperate. We address the coherent inclusion of emotions in a trust model in order to investigate practical approaches to current psychological theories. The most innovative contribution of this work is the elucidation of how privacy issues play a role in the cooperation decisions of the emotional trust model. Both emotions and trust were cognitively modeled and managed with the belief-desire-intention (BDI) paradigm in autonomous agents implemented in GAML (the programming language of the GAMA agent platform) that communicate using the IEEE FIPA standard. The trusting behavior of these emotional agents was tested in a cooperative logistics problem wherein agents have to move objects to destinations, and some of the objects and places are associated with privacy issues. Simulations of the logistics problem show how emotions and trust contribute to improving the performance of agents in terms of both time savings and privacy protection.
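The paper's agents are implemented in GAML on the GAMA platform; as an illustration only, the sketch below uses Python with entirely hypothetical names, weights, and thresholds to show one way a delegation decision could combine experience-based trust, an emotional bias, and a privacy penalty. It is not the authors' actual model.

```python
from dataclasses import dataclass

@dataclass
class Partner:
    """Beliefs the trusting agent holds about a potential cooperation partner (hypothetical fields)."""
    name: str
    successes: int = 0          # tasks the partner completed as promised
    failures: int = 0           # tasks the partner failed or mishandled
    privacy_breaches: int = 0   # times the partner exposed a sensitive object or place

@dataclass
class EmotionalTrustAgent:
    """Minimal BDI-flavoured agent: beliefs about partners, a desire to move objects
    quickly, and an intention to delegate only when trust is high enough."""
    joy: float = 0.5    # emotional state in [0, 1]; higher joy makes the agent more trusting
    fear: float = 0.0   # fear (e.g. after a privacy breach) makes it more cautious

    def trust(self, p: Partner, task_is_sensitive: bool) -> float:
        # Experience-based trust: Laplace-smoothed success ratio.
        experience = (p.successes + 1) / (p.successes + p.failures + 2)
        # Emotional bias: joy raises trust, fear lowers it (illustrative weights).
        emotional_bias = 0.2 * self.joy - 0.3 * self.fear
        # Privacy penalty only applies when the task involves a sensitive object or place.
        privacy_penalty = 0.25 * p.privacy_breaches if task_is_sensitive else 0.0
        return max(0.0, min(1.0, experience + emotional_bias - privacy_penalty))

    def decide_delegation(self, p: Partner, task_is_sensitive: bool,
                          threshold: float = 0.6) -> bool:
        """Intention formation: delegate the transport task only if trust exceeds
        a (hypothetical) cooperation threshold."""
        return self.trust(p, task_is_sensitive) >= threshold

if __name__ == "__main__":
    agent = EmotionalTrustAgent(joy=0.7)
    courier = Partner("uv-1", successes=8, failures=1, privacy_breaches=2)
    # Past privacy breaches block delegation of a sensitive task...
    print(agent.decide_delegation(courier, task_is_sensitive=True))    # False
    # ...but the same partner is still trusted with a non-sensitive one.
    print(agent.decide_delegation(courier, task_is_sensitive=False))   # True
```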