We consider the stability of functional inequalities concerning entropy functionals. For the Boltzmann–Shannon entropy, the logarithmic Sobolev inequality bounds the entropy by the Fisher information, and the Heisenberg uncertainty principle follows by combining it with the Shannon inequality. The optimizer for these inequalities is the Gaussian function, which is the fundamental solution to the heat equation. In statistical mechanics and information theory, the Tsallis entropy is known as a one-parameter extension of the Boltzmann–Shannon entropy, and its Wasserstein gradient flow corresponds to a quasilinear diffusion equation. We study improved versions and the stability of optimizers for the logarithmic Sobolev inequality associated with the Tsallis entropy. Furthermore, we establish stability results for the uncertainty principle concerning the Tsallis entropy.
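For reference, the Tsallis entropy and its connection to the Boltzmann–Shannon entropy and to quasilinear diffusion can be sketched as follows; the normalization here is one common convention and may differ from the paper's by constants:

```latex
% Tsallis entropy of a probability density \rho on \mathbb{R}^n,
% with parameter q > 0, q \neq 1 (one common normalization):
S_q(\rho) = \frac{1}{1-q}\left(\int_{\mathbb{R}^n} \rho(x)^q \, dx - 1\right).

% In the limit q \to 1, it recovers the Boltzmann--Shannon entropy:
\lim_{q \to 1} S_q(\rho) = -\int_{\mathbb{R}^n} \rho(x) \log \rho(x) \, dx.

% Up to normalization, its Wasserstein gradient flow is the quasilinear
% diffusion equation (porous medium for q > 1, fast diffusion for q < 1):
\partial_t \rho = \Delta \rho^{\,q}.
```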