Abstract
In statistical physics entropy is usually introduced as a global quantity which expresses the amount of information that would be needed to specify the microscopic configuration of a system. However, for lattice models with infinitely many possible configurations per lattice site it is also meaningful to introduce entropy as a local observable that describes the information content of a single lattice site. Likewise, the mutual information between two sites can be interpreted as a two-point correlation function which quantifies how much information one lattice site has about the state of another and vice versa. Studying a particular growth model, we demonstrate that the mutual information exhibits scaling properties that are consistent with the established phenomenological scaling picture.
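For concreteness, here is a minimal sketch of how the single-site entropy and the two-site mutual information described above can be estimated from sampled lattice configurations. It assumes the standard Shannon definitions, S_i = -Σ p log p for the marginal at site i and I_ij = S_i + S_j - S_ij for the pair (i, j); the precise normalization, model, and estimators used in the paper are not reproduced here, and the array names are illustrative only.

```python
import numpy as np

def site_entropy(samples, i):
    """Shannon entropy (in nats) of the marginal distribution at site i,
    estimated from the empirical frequencies of its local states."""
    _, counts = np.unique(samples[:, i], return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def pair_entropy(samples, i, j):
    """Joint Shannon entropy of the pair of sites (i, j)."""
    _, counts = np.unique(samples[:, [i, j]], axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mutual_information(samples, i, j):
    """Mutual information I_ij = S_i + S_j - S_ij, read as a two-point
    correlation function of the local information content."""
    return (site_entropy(samples, i) + site_entropy(samples, j)
            - pair_entropy(samples, i, j))

if __name__ == "__main__":
    # Placeholder data: 'samples' stands for an (n_samples, L) integer array of
    # configurations generated by the growth model; here it is i.i.d. noise,
    # so the estimated mutual information should be close to zero.
    rng = np.random.default_rng(0)
    L, n = 64, 10_000
    samples = rng.integers(0, 3, size=(n, L))
    print(mutual_information(samples, 10, 20))
```

In a scaling analysis one would evaluate such an estimator for pairs of sites at varying separation and at different times, and then test for data collapse; that procedure is only gestured at here, not taken from the paper.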