Abstract
We develop a new method for bounding the relative entropy of a random vector in terms of its Stein factors. Our approach is based on a novel representation for the score function of smoothly perturbed random variables, as well as on de Bruijn's formula of information theory. When applied to sequences of functionals of a general Gaussian field, our results can be combined with the Carbery–Wright inequality in order to yield multidimensional entropic rates of convergence that coincide, up to a logarithmic factor, with those achievable in smooth distances (such as the 1-Wasserstein distance). In particular, our findings settle the open problem of proving a quantitative version of the multidimensional fourth moment theorem for random vectors having chaotic components, with explicit rates of convergence in total variation that are independent of the order of the associated Wiener chaoses. The results proved in the present paper are outside the scope of other existing techniques, such as the multidimensional Stein's method for normal approximations.
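For readers less familiar with the tools named above, the following standard statements (given in the usual normalizations; they are background facts, not results established in this paper) may help fix ideas. De Bruijn's identity expresses the derivative of differential entropy along a Gaussian perturbation in terms of Fisher information: if $Z$ is a standard Gaussian vector independent of $X$, then
\[
  \frac{d}{dt}\, h\bigl(X + \sqrt{t}\,Z\bigr) \;=\; \tfrac{1}{2}\, J\bigl(X + \sqrt{t}\,Z\bigr), \qquad t > 0,
\]
where $h$ denotes differential entropy and $J$ Fisher information. The (one-dimensional) Nualart–Peccati fourth moment theorem states that, for a sequence $F_n$ belonging to a fixed Wiener chaos of order $q \ge 2$ with $\mathbb{E}[F_n^2] \to 1$,
\[
  F_n \xrightarrow{\ \mathrm{law}\ } N(0,1)
  \quad\Longleftrightarrow\quad
  \mathbb{E}\bigl[F_n^4\bigr] \to 3 .
\]
The present paper quantifies multidimensional versions of this convergence in total variation and relative entropy.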