Abstract

In this work, we provide a logical characterization of trust, based on a modal logic that expresses a computational notion of trust quantitatively dependent on the beliefs held by the agent. The proposed framework encompasses decidability results and equivalence laws that highlight the properties of trust. The overall aim is to obtain a formal notion of trust that can be employed in the further development of formal languages for decision-making procedures and soft-security mechanisms in online, digital environments. Such a formal counterpart of trust should support agents, whether human or artificial, in devising secure decision strategies based on partial and/or indirect information.
