Abstract

Information can be understood as that which reduces uncertainty, whatever its origin. In the field of human communication, information is only meaningful if it is part of a purposive or intentional action. Meaning should be approached from the empirical perspective of language use. If we study how signification is processed and transmitted in the normal use of language, we see that it takes place by communicating a set of prototype categories, the core or central facts, which define meaning as an empirical hypothesis. But if there are central facts governing the use of words, then other, more or less peripheral, facts should also exist, knowledge of which is necessary in order to communicate in contexts far removed from the “denotative conceptual norm”. Hence meaning can be represented by a fuzzy subset of the partition set of the universe of discourse. This concept of meaning can be integrated into a formal model of a semantic source, and information can be measured by non-probabilistic entropy.
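The abstract does not specify which non-probabilistic entropy is used; as an illustration only, a standard measure of this kind is the De Luca–Termini fuzzy entropy. In the sketch below, the symbols are assumptions rather than the paper's own notation: A is the fuzzy subset representing a meaning, x_1, …, x_n are the elements of the partitioned universe of discourse, and μ_A(x_i) is the membership degree of x_i in A.

```latex
% De Luca–Termini non-probabilistic (fuzzy) entropy of a fuzzy subset A
% defined over a finite partition {x_1, ..., x_n} of the universe of discourse.
% \mu_A(x_i) is the membership degree of x_i in A; K > 0 is a normalizing constant.
H(A) = -K \sum_{i=1}^{n} \Big[ \mu_A(x_i)\,\ln \mu_A(x_i)
       + \big(1 - \mu_A(x_i)\big)\,\ln\big(1 - \mu_A(x_i)\big) \Big]
```

Under this illustrative measure, H(A) vanishes when every membership degree is 0 or 1 (a crisp, fully central meaning) and is maximal when all degrees equal 1/2, which mirrors the idea that peripheral, context-dependent uses carry the greatest residual uncertainty.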
