Abstract

Algorithmic information theory in conjunction with Landauer’s principle can quantify the cost of maintaining a reversible real-world computational system distant from equilibrium. As computational bits are conserved in an isolated reversible system, bit flows can be used to track the way a highly improbable configuration trends toward a highly probable equilibrium configuration. In an isolated reversible system, all microstates within a thermodynamic macrostate have the same algorithmic entropy. From a thermodynamic perspective, however, when these bits primarily specify stored energy states, corresponding to a fluctuation from the most probable set of states, they represent “potential entropy”. These bits become “realised entropy” when, under the second law of thermodynamics, they become bits specifying the momentum degrees of freedom. The distance of a fluctuation from equilibrium is identified as the number of computational bits that must move from stored energy states to momentum states to define a highly probable, or typical, equilibrium state. When reversibility applies, it follows from Landauer’s principle that it costs k_B T ln2 Joules to move a bit within the system from the stored energy states to the momentum states.
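The per-bit cost from Landauer’s principle can be sketched numerically. This is an illustrative calculation only, not code from the paper; the function name and the 300 K example temperature are assumptions made here.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def landauer_cost(bits: float, temperature_k: float) -> float:
    """Minimum energy in Joules, E = N * k_B * T * ln 2, associated with
    moving N computational bits from stored energy states to momentum
    states at absolute temperature T (Landauer's principle)."""
    return bits * K_B * temperature_k * math.log(2)

# One bit at room temperature (300 K): roughly 2.87e-21 J
print(landauer_cost(1, 300.0))
```

At room temperature the cost per bit is tiny in absolute terms, but summed over the bits that must flow to reach a typical equilibrium state it quantifies the distance of a fluctuation from equilibrium.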

Highlights

  • Mathematicians have developed algorithmic information theory (AIT), and the associated concept of Kolmogorov or algorithmic complexity, to measure the computational resources needed to describe an object in terms of the length of the shortest string of characters that represents the object

  • If a system exists in an improbable configuration outside the most probable equilibrium set of microstates, its thermodynamic entropy increases as the system trends to equilibrium

  • This paper provides a conceptual framework that identifies how the computational behaviour of a real-world system determines its thermodynamic entropy, allowing stable, non-equilibrium systems to be analysed



Introduction

Mathematicians have developed algorithmic information theory (AIT), and the associated concept of Kolmogorov or algorithmic complexity, to measure the computational resources needed to describe an object in terms of the length of the shortest string of characters that represents the object. By focusing on the level of the microstate, the algorithmic entropy provides a deterministic understanding of how the thermodynamic entropy increases as a system trends to equilibrium, in terms of the computational processes involved. When ordering occurs, such as when magnetic spins align in a magnetic system, bits previously specifying random magnetic states become bits specifying momentum states, raising the temperature. To avoid confusion, here the word “ordered” will be used to mean a low algorithmic entropy system that might be deemed complex in the intuitive sense, while the word “information” will be used in the mathematical sense to mean the number of bits needed to describe the system algorithmically.
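The shortest-description idea above can be illustrated with a compression proxy. True Kolmogorov complexity is uncomputable, so the sketch below uses a compressed size as an upper bound on the description length; the helper name and the particular strings are assumptions made here, not material from the paper.

```python
import os
import zlib

def description_length(data: bytes) -> int:
    """Compressed size in bytes: a computable upper bound (proxy)
    for the algorithmic (Kolmogorov) complexity of the string."""
    return len(zlib.compress(data, level=9))

# A highly ordered string has a short description ("repeat 'ab' 500 times"),
# while a typical random string is essentially incompressible.
ordered = b"ab" * 500
random_bytes = os.urandom(1000)

print(description_length(ordered))       # far smaller than 1000
print(description_length(random_bytes))  # close to (or above) 1000
```

The ordered string compresses to a few tens of bytes while the random string does not, mirroring the distinction drawn here between low algorithmic entropy (“ordered”) configurations and typical equilibrium ones.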

Formal Outline of the Algorithmic Entropy
Specifying the Shortest Algorithm
The Provisional Entropy
The Algorithmic Entropy of a Real-World Thermodynamic Macrostate
Perspectives of Real-World Computations
Why Bits Are Conserved
Net Entropy Flows and the Algorithmic Entropy
Non-Equilibrium States and Fluctuations within Equilibrium
The Trajectory Approach
Landauer’s Principle
The Principle
An Additional Perspective of the Algorithmic Approach
Landauer’s Principle and Resource Use of Natural Systems
Interpreting Bit Transfers in a Far-From-Equilibrium System
Findings
Conclusions
