Abstract

We introduce a family of Maxwellian Demons for which correlations among information-bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine a Demon's functional thermodynamic operating regimes, where previous methods either misclassify them or simply fail due to the approximations they invoke. It also reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of thermodynamics that relies on the Kolmogorov–Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions, so it applies quite broadly: to Demons with and without memory and to input sequences that are correlated or uncorrelated. We note that general Maxwellian Demons readily violate previously proposed alternative bounds of this kind, while the current bound still holds. As such, it broadly describes the minimal energetic cost of any computation performed by a thermodynamic system.
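
For orientation, the refined Second Law described above bounds the average work extracted per symbol by the change in the symbol sequence's Shannon entropy rate (its Kolmogorov–Sinai entropy) rather than by the change in single-symbol entropy. Schematically, and leaving the paper's precise conditions aside, a bound of this general form reads:

    \langle W \rangle \;\le\; k_{\mathrm B} T \ln 2 \,\bigl(h'_\mu - h_\mu\bigr)

where h_\mu and h'_\mu denote the entropy rates of the input and output symbol sequences, T the reservoir temperature, and k_B Boltzmann's constant. Because the entropy rate accounts for inter-symbol correlations, while the single-symbol Shannon entropy does not, a bound of this form remains informative even when the Demon erases by adding temporal order rather than by lowering single-symbol uncertainty.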

Highlights

  • The Second Law of Thermodynamics is only statistically true: while the entropy production in any process is nonnegative on the average, ∆S ≥ 0, if we wait long enough, we shall see individual events for which the entropy production is negative

  • While the system’s instantaneous distributions relax and change over time, the information reservoir itself is not allowed to build up and store memory or correlations. Note that this framework differs from alternative approaches to the thermodynamics of information processing, including: (i) active feedback control by external means, where the thermodynamic account of the Demon’s activities tracks the mutual information between measurement outcomes and system state [20,21,22,23,24,25,26,27,28,29,30,31,32,33]; (ii) the multipartite framework where, for a set of interacting, stochastic subsystems, the Second Law is expressed via their intrinsic entropy production, correlations among them, and transfer entropy [34,35,36,37]; and (iii) steady-state models that invoke time-scale separation to identify a portion of the overall entropy production as an information current [38, 39]

  • Thermodynamic systems that include information reservoirs as well as thermal and work reservoirs are an area of growing interest, driven in many cases by biomolecular chemistry or nanoscale physics and engineering

Summary

INTRODUCTION

The Second Law of Thermodynamics is only statistically true: while the entropy production in any process is nonnegative on average, ∆S ≥ 0, if we wait long enough we shall see individual events for which the entropy production is negative. In the information-reservoir framework adopted here, the Demon processes a sequence of symbols; while the system's instantaneous distributions relax and change over time, the information reservoir itself is not allowed to build up and store memory or correlations. Note that this framework differs from alternative approaches to the thermodynamics of information processing, including: (i) active feedback control by external means, where the thermodynamic account of the Demon's activities tracks the mutual information between measurement outcomes and system state [20,21,22,23,24,25,26,27,28,29,30,31,32,33]; (ii) the multipartite framework where, for a set of interacting, stochastic subsystems, the Second Law is expressed via their intrinsic entropy production, correlations among them, and transfer entropy [34,35,36,37]; and (iii) steady-state models that invoke time-scale separation to identify a portion of the overall entropy production as an information current [38,39]. The new thermodynamic functionality identified here provocatively suggests why real-world ratchets support memory: the very functioning of memoryful Demons relies on leveraging temporally correlated fluctuations in their environment.
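
To make concrete why temporal correlations matter for such bounds, here is a minimal numerical sketch (illustrative only, not taken from the paper): it compares the single-symbol Shannon entropy H(X) of a binary Markov input with its entropy rate h_mu. The persistence parameter q and the helper shannon() are hypothetical choices used purely for illustration.

    import numpy as np

    def shannon(p):
        """Shannon entropy, in bits, of a discrete distribution."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Illustrative binary Markov input: each symbol repeats with probability q.
    # P[i, j] = Pr(next symbol = j | current symbol = i).
    q = 0.9
    P = np.array([[q, 1 - q],
                  [1 - q, q]])

    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()

    # Single-symbol entropy versus entropy rate h_mu = sum_i pi_i * H(P[i, :]).
    H_single = shannon(pi)
    h_mu = sum(pi[i] * shannon(P[i]) for i in range(len(pi)))

    print(f"H(X) = {H_single:.3f} bits/symbol")  # ~1.000: each symbol alone looks maximally random
    print(f"h_mu = {h_mu:.3f} bits/symbol")      # ~0.469: correlations reduce the true randomness rate

A bound written only in terms of single-symbol entropy sees one full bit of randomness per symbol here, whereas the sequence actually supplies only about 0.47 bits per symbol of irreducible randomness; the remaining ~0.53 bits per symbol is temporal order, which is precisely the kind of structure a memoryful Demon can, in principle, leverage and a memoryless accounting misses.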

INFORMATION RATCHETS
ENERGETICS AND DYNAMICS
INFORMATION
THERMODYNAMIC FUNCTIONALITY
CONCLUSION
