Abstract

Computational trust is the digital counterpart of the human notion of trust as applied in social systems. Its main purpose is to improve the reliability of interactions in online communities and of knowledge transfer in information management systems. Trust models are typically composed of two parts: a trust computing part and a trust manipulation part. The former serves the purpose of gathering relevant information and then using it to compute initial trust values; the latter takes the initial trust values as given and manipulates them for specific purposes, such as aggregation and propagation of trust, which are at the base of a notion of reputation. While trust manipulation is widely studied, very little attention is paid to the trust computing part. In this paper, we propose a formal language with which we can reason about knowledge, trust, and their interaction. Specifically, in this setting it is possible to relate possessed knowledge directly to values estimating trust, distrust, and uncertainty, which can then be used to feed any trust manipulation component of computational trust models.
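As a rough illustration of the kind of output a trust computing component could hand to a trust manipulation component, the following sketch maps observed evidence to a (trust, distrust, uncertainty) triple in the style of Subjective Logic's standard evidence mapping, which the paper discusses later. The function name and the prior weight `W` are illustrative assumptions, not the paper's formalism.

```python
def opinion(r: float, s: float, W: float = 2.0):
    """Map r positive and s negative observations to a (b, d, u) triple.

    Sketch of the standard subjective-logic evidence mapping, with
    non-informative prior weight W (conventionally 2).  The three
    components always sum to 1: b estimates trust, d distrust, and
    u the remaining uncertainty.
    """
    denom = r + s + W
    return r / denom, s / denom, W / denom

# Mostly positive evidence yields high trust and low uncertainty.
b, d, u = opinion(8, 2)
```

With little evidence (small `r + s`) the uncertainty component dominates, which is exactly the behavior a trust manipulation layer needs in order to discount poorly supported opinions during aggregation.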

Highlights

  • Given the growing number of interactions in online communities and of information exchanges in information management systems, it is becoming increasingly important to have security mechanisms that can prevent fraudulent behaviors

  • Computational trust is the digital counterpart of trust as applied in ordinary social communities; computational trust models are soft security mechanisms that implement the notion of trust in digital environments to increase the quantity and quality of interactions

  • In this setting it is possible to relate possessed knowledge directly to values estimating trust, distrust, and uncertainty, which can be used to feed the trust manipulation component of any computational trust model

Summary

Introduction

Given the growing number of interactions in online communities and of information exchanges in information management systems, it is becoming increasingly important to have security mechanisms that can prevent fraudulent behaviors. Trust is one form of soft security that can be implemented in a system: trust is a social control mechanism that has a clearly positive impact on cooperative operations, both by increasing the chances of performing an interaction and by decreasing the chances of malevolent behaviors during those interactions [1, 18]. Trust thus has both a proactive and a control effect over interactions.

Trust Computing and Trust Manipulation
Marsh’s Trust Model
Yu and Singh’s Trust Model
BDI + Repage
Summing Up
Subjective Logic
A Language for Trust
Syntax
Semantics
From Knowledge to Trust
Summing up
Example
Conclusion and Future Work