Abstract

We investigate the connection between the time evolution of averages of stochastic quantities and the Fisher information and its induced statistical length. As a consequence of the Cramér-Rao bound, we find that the rate of change of the average of any observable is bounded from above by its variance times the temporal Fisher information. From this bound we obtain a speed limit on the evolution of stochastic observables: changing the average of an observable requires a minimum amount of time, given by the squared change in the average divided by the fluctuations of the observable times the thermodynamic cost of the transformation. In particular, for relaxation dynamics, which do not depend on time explicitly, we show that the Fisher information is a monotonically decreasing function of time and that this minimal required time is determined by the initial preparation of the system. We further show that the monotonicity of the Fisher information can be used to detect hidden variables in the system, and we demonstrate our findings for simple examples of continuous and discrete random processes.
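The two results stated in the abstract can be written out schematically. The following is a sketch in our own notation (A for the observable, I(t) for the temporal Fisher information), which may differ from the symbols used in the paper:

```latex
% Bound on the rate of change of an average, as described in the abstract
% (notation is ours, not necessarily the paper's):
\left(\frac{\mathrm{d}\langle A \rangle}{\mathrm{d}t}\right)^{2}
  \leq \operatorname{Var}(A)\, I(t).
% Integrating over a duration tau and applying the Cauchy-Schwarz
% inequality gives a speed limit of the stated form,
\tau \geq \frac{\left(\Delta\langle A \rangle\right)^{2}}
               {\operatorname{Var}(A)_{\max}\,\int_{0}^{\tau} I(t)\,\mathrm{d}t},
% where Delta<A> is the change in the average and the time-integrated
% Fisher information plays the role of the thermodynamic cost.
```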

Highlights

  • Information geometry [1] is a branch of information theory that describes information in terms of differential geometry

  • Applying information geometry to physics can be motivated by a question central to any physical experiment: Given a system described by a set of parameters, how much information about the system can we gain by observing its change under a variation of the parameters? As it turns out, the effect of smooth parameter variations defines a metric called the Fisher information metric [2,3,4,5]

  • We focus on the physical interpretation of Fisher information and its consequences for the time evolution of stochastic systems and observables


Summary

INTRODUCTION

Information geometry [1] is a branch of information theory that describes information in terms of differential geometry. The monotonicity of the Fisher information for relaxation processes has two further profound consequences. First, it results in a lower bound on the time required to relax a stochastic system from an initial to a final configuration, extending previously obtained speed limits for stochastic dynamics [21,30]. Second, it can serve as an indicator for the presence of hidden variables in the system: if we observe an increase of the Fisher information during a relaxation process, this increase necessarily implies that we are missing some information about the system. We show that this discrepancy between observed and total information can be used to detect hidden degrees of freedom.
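The bound and the monotonicity statement above can be checked numerically on a toy relaxation process. The sketch below uses an illustrative three-state master equation dp/dt = W p; the rates (chosen to satisfy detailed balance with respect to the stationary distribution (0.5, 0.3, 0.2)) and the observable A are assumptions made here for demonstration, not quantities taken from the paper.

```python
import numpy as np

# Illustrative 3-state master equation dp/dt = W p with time-independent
# rates (an assumption for this sketch, not taken from the paper).
# Columns of W sum to zero, so total probability is conserved.
W = np.array([[-1.0,  1.0,      1.0],
              [ 0.6, -5.0/3.0,  1.0],
              [ 0.4,  2.0/3.0, -2.0]])
A = np.array([0.0, 1.0, 3.0])      # observable value in each state
p = np.array([0.8, 0.15, 0.05])    # strictly positive initial distribution

dt, steps = 1e-3, 5000
I_values, bound_gaps = [], []
for _ in range(steps):
    pdot = W @ p
    I = np.sum(pdot ** 2 / p)          # temporal Fisher information
    mean = A @ p
    var = (A - mean) ** 2 @ p          # Var(A) under p(t)
    rate = A @ pdot                    # d<A>/dt
    bound_gaps.append(var * I - rate ** 2)   # Cramer-Rao-type slack, >= 0
    I_values.append(I)
    p = p + dt * pdot                  # explicit Euler step
```

Along the trajectory, `bound_gaps` stays non-negative (the rate of change of the average never exceeds the variance times the Fisher information), and `I_values` decreases monotonically toward zero as the system relaxes to its stationary state.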

INTRINSIC SPEED OF STOCHASTIC DYNAMICS
MONOTONICITY OF RELAXATION PROCESSES
THERMODYNAMIC INTERPRETATION OF FISHER INFORMATION
GEOMETRIC INTERPRETATION
General normal distributions
Brownian motion
Particle in a parabolic trap
DETECTION OF HIDDEN STATES
DISCUSSION