Abstract

An information-theoretic notion of entropy is proposed for a system of N interacting particles which assesses an observer's limited knowledge of the state of the system, assuming that he or she can measure with arbitrary precision all one-particle observables and correlations involving some number p of the particles but is completely ignorant of the form of any higher-order correlations involving more than p particles. The idea is to define a generic measure of entropy S[μ̃] = −Tr μ̃ log μ̃ for an arbitrary density matrix or distribution function μ̃, and then, given the "true" N-particle μ, to define a "reduced" μ_Rp which reflects the observer's partial knowledge. The result, at any time t, is a chain of inequalities S[μ_R1] ≥ S[μ_R2] ≥ … ≥ S[μ_RN] ≡ S[μ], with true equality S[μ_Rp] = S[μ_Rp+1] if and only if the true μ factorizes exactly into a product of contributions involving all possible p-particle groupings. It follows further that (1) if, at some initial time t0, the true μ factorizes in this way, then S[μ_Rp(t)] ≥ S[μ_Rp(t0)] for all finite times t > t0, with equality if and only if the factorization is restored, and (2) the initial response of the system must be to increase its p-particle entropy.
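The first link in the chain of inequalities, S[μ_R1] ≥ S[μ], can be illustrated numerically for the simplest case N = 2. The sketch below (not from the paper; the choice of a two-qubit Werner-type state and all variable names are illustrative assumptions) builds a correlated two-particle density matrix μ, forms the reduced description μ_R1 as the product of its one-particle marginals, and checks that the reduced entropy exceeds the true entropy, with equality only when μ factorizes:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy S = -Tr(rho log rho), natural log."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Illustrative correlated two-particle state (mixture of a Bell state
# and white noise); any correlated mu gives a strict inequality.
phi = np.zeros(4)
phi[0] = phi[3] = 1.0 / np.sqrt(2.0)          # (|00> + |11>)/sqrt(2)
p = 0.5
mu = p * np.outer(phi, phi) + (1 - p) * np.eye(4) / 4

# One-particle marginals: partial traces over the other particle.
mu4 = mu.reshape(2, 2, 2, 2)
rho_A = np.trace(mu4, axis1=1, axis2=3)
rho_B = np.trace(mu4, axis1=0, axis2=2)

# Reduced description mu_R1 = rho_A (x) rho_B discards all
# two-particle correlations the observer cannot measure.
mu_R1 = np.kron(rho_A, rho_B)

S_R1, S_full = entropy(mu_R1), entropy(mu)
print(S_R1, S_full)   # S[mu_R1] >= S[mu], strict here since mu is correlated
```

Since μ_R1 is a product state, S[μ_R1] = S[ρ_A] + S[ρ_B], so this check is just the subadditivity of the von Neumann entropy; replacing μ by an exact product ρ_A ⊗ ρ_B makes the two entropies coincide, mirroring the equality condition stated in the abstract.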

