Abstract

Modern digital electronics support remarkably reliable computing, especially given the challenge of controlling nanoscale logical components that interact in fluctuating environments. However, we demonstrate that the high-reliability limit is subject to a fundamental error–energy-efficiency tradeoff that arises from time-symmetric control: requiring a low probability of error causes energy consumption to diverge as the logarithm of the inverse error rate for nonreciprocal logical transitions. The reciprocity (self-invertibility) of a computation is a stricter condition for thermodynamic efficiency than logical reversibility (invertibility), the latter being the root of Landauer's work bound on erasing information. Beyond engineered computation, the results identify a generic error–dissipation tradeoff in steady-state transformations of genetic information carried out by biological organisms. The lesson is that computational dissipation under time-symmetric control cannot reach, and is often far above, the Landauer limit. In this way, time-asymmetry becomes a design principle for thermodynamically efficient computing.

Received 14 September 2019; Accepted 24 August 2020
DOI: https://doi.org/10.1103/PhysRevResearch.2.033524
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

Physics Subject Headings (PhySH): Fluctuation theorems; Nonequilibrium & irreversible thermodynamics; Nonequilibrium fluctuations; Nonequilibrium statistical mechanics; Self-organization; Thermodynamics of computation; Condensed Matter, Materials & Applied Physics; Nonlinear Dynamics; Statistical Physics; Biological Physics

Highlights

  • The thermodynamics of computation tells us that information processing can be achieved with zero energy dissipation if one has sufficient control over a system’s microscopic degrees of freedom and can endure the quasistatic limit of infinitely slow processing [1,2,3,4,5].

  • This state of affairs poses a grand challenge to thermodynamic computing: identify control protocols that reliably drive a system between memory states according to a desired computation in finite time and with minimal dissipation.

  • We discovered that the time-reversal symmetries of the memory elements themselves can substantially alter the minimal thermodynamic costs.


Introduction

The thermodynamics of computation tells us that information processing can be achieved with zero energy dissipation if one has sufficient control over a system’s microscopic degrees of freedom and can endure the quasistatic limit of infinitely slow processing [1,2,3,4,5]. Reliable computation could thus be performed with arbitrarily little dissipation, but only at the cost of arbitrarily slow processing. However, computation must be performed in finite time; this requires additional work and guarantees that part of the investment is lost via dissipation. This state of affairs poses a grand challenge to thermodynamic computing: identify control protocols that reliably drive a system between memory states according to a desired computation in finite time and with minimal dissipation.
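As a rough numerical illustration of the tradeoff the abstract states, the sketch below compares the Landauer bound per erased bit, k_B T ln 2, against a dissipation that grows as ln(1/ε) in the error rate ε. The k_B T prefactor on the logarithmic scaling is an assumption made purely for illustration; the paper derives the exact coefficient, which is not reproduced here.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer bound: minimum heat dissipated per erased bit.
landauer = k_B * T * math.log(2)

def min_dissipation(epsilon, prefactor=k_B * T):
    """Illustrative ln(1/epsilon) dissipation scaling for a
    nonreciprocal transition under time-symmetric control.
    The prefactor is an assumed placeholder, not the paper's
    derived coefficient."""
    return prefactor * math.log(1.0 / epsilon)

for eps in (1e-3, 1e-9, 1e-15):
    ratio = min_dissipation(eps) / landauer
    print(f"eps = {eps:g}: dissipation ≈ {ratio:.1f} × Landauer bound")
```

Even with this schematic prefactor, the qualitative point survives: as the demanded error rate drops from 10⁻³ to 10⁻¹⁵, the minimal dissipation climbs tens of times above the Landauer limit rather than approaching it.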

