Statistical divergences are important tools in data analysis, information theory, and statistical physics, and well-known inequalities bound their values. In many circumstances involving temporal evolution, however, one instead needs bounds on the rates of such quantities. Here, several general upper bounds on the rates of some f-divergences are derived, valid for any type of stochastic dynamics (both Markovian and non-Markovian), in terms of information-like and/or thermodynamic observables. As special cases, analytical bounds on the rate of mutual information are obtained. The central role in all these bounds is played by the temporal Fisher information, which characterizes the speed of global system dynamics; some of the bounds also contain entropy production, suggesting a link with stochastic thermodynamics. Indeed, the derived inequalities can be used to estimate minimal dissipation and global speed in stochastic thermodynamic systems. Specific applications in physics and neuroscience are given, including bounds on the rates of free energy and work in nonequilibrium systems, limits on the speed of information gain in learning synapses, and bounds on the speed of predictive inference and on the learning rate. Overall, the derived bounds can be applied to any complex network of interacting elements where predictability and thermodynamics of network dynamics are of prime concern.
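As an illustration of the general flavor of such rate bounds (a minimal sketch in notation introduced here, not necessarily the exact inequalities derived in the paper), consider the temporal Fisher information of a time-dependent distribution p(x,t),

\[
  I_F(t) \;=\; \sum_x p(x,t)\,\bigl[\partial_t \ln p(x,t)\bigr]^2 ,
\]

and the Kullback-Leibler divergence to a fixed reference q(x). Using \(\sum_x \partial_t p(x,t) = 0\) together with the Cauchy-Schwarz inequality,

\[
  \Bigl|\frac{d}{dt}\, D_{\mathrm{KL}}\bigl(p_t \,\|\, q\bigr)\Bigr|
  \;=\; \Bigl|\sum_x p(x,t)\,\partial_t \ln p(x,t)\,\ln\frac{p(x,t)}{q(x)}\Bigr|
  \;\le\; \sqrt{I_F(t)}\;\sqrt{\sum_x p(x,t)\,\ln^2\frac{p(x,t)}{q(x)}} .
\]

In this sketch the rate of an information-theoretic quantity is limited by the square root of the temporal Fisher information (the global speed of the dynamics) multiplied by an information-like observable of the current state, which is the general structure of the bounds described in the abstract.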