Abstract

Self-organizing systems should exhibit decreasing informational entropy, and a meaningful model of this property is provided by the maximum entropy principle, under which the entropy value achieved decreases as the number of constraints increases. As a result, self-organizing systems could be characterized by a property of self-creation of constraints. The approach can be extended to probability distributions that depend on an external distributed parameter. Within this same information-theoretic framework, the concept of negentropy (negative entropy), introduced by physicists a few decades ago, can be directly identified with the informational entropy of deterministic functions (which can be thought of as related to the entropy of forms).
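The central claim, that the maximum achievable entropy decreases as constraints are added, can be illustrated numerically. The sketch below (an assumption for illustration, not from the paper) takes a distribution over six outcomes: with normalization as the only constraint, the maximum-entropy distribution is uniform; adding one moment constraint (a prescribed mean) forces a Gibbs-form distribution with strictly lower entropy. The helper names `maxent_mean` and `entropy` are hypothetical.

```python
import math

def maxent_mean(values, target_mean, tol=1e-10):
    # The maximum-entropy distribution over `values` with E[X] = target_mean
    # has the exponential (Gibbs) form p_i ∝ exp(lam * x_i).
    # The mean is monotone increasing in lam, so find lam by bisection.
    def mean_for(lam):
        w = [math.exp(lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

def entropy(p):
    # Shannon entropy in nats: H = -sum p_i log p_i
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

values = [1, 2, 3, 4, 5, 6]
# Normalization only: the maximum-entropy distribution is uniform, H = log 6.
h_unconstrained = entropy([1 / 6] * 6)
# One additional constraint (mean fixed at 4.5) lowers the achievable entropy.
p = maxent_mean(values, 4.5)
h_constrained = entropy(p)
print(h_unconstrained, h_constrained)  # h_constrained < h_unconstrained
```

Each further independent constraint shrinks the feasible set of distributions, so the maximized entropy can only stay equal or fall, which is the monotonicity the abstract invokes.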
