The classical Berry–Esseen error bound, for the normal approximation to the law of a sum of independent and identically distributed random variables, is here improved by replacing the standardised third absolute moment with a weak norm distance to normality, using Zolotarev’s ζ norms. We thus sharpen and simplify two results of Ul’yanov (1976) and of Senatov (1998), each of them previously optimal, in the line of research initiated by Zolotarev (1965) and Paulauskas (1969). Our proof is based on a seemingly incomparable normal approximation theorem of Zolotarev (1986), combined with our main technical result: the Kolmogorov distance (supremum norm of difference of distribution functions) between a convolution of two laws and a convolution of two Lipschitz laws is bounded homogeneously of degree 1 in the pair of the Kantorovich distances (often called Wasserstein distances, the L^1 norms of differences of distribution functions) of the corresponding factors, and also in the pair of the Lipschitz constants. Side results include a short introduction to ζ norms on the real line, simpler inequalities for various probability distances, slight improvements of the theorem of Zolotarev (1986) and of a lower bound theorem of Bobkov, Chistyakov and Götze (2012), an application to sampling from finite populations, auxiliary results on rounding and on winsorisation, and computations of a few examples. The introductory section in particular is aimed at analysts in general rather than specialists in probability approximations.
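For orientation, the two distances named above are the standard ones paraphrased parenthetically in the abstract; the precise form and constants of the convolution inequality appear only in the paper itself. For distribution functions F and G of laws on the real line,
\[
\|F-G\|_\infty \;=\; \sup_{x\in\mathbb{R}} \bigl|F(x)-G(x)\bigr|,
\qquad
\|F-G\|_1 \;=\; \int_{\mathbb{R}} \bigl|F(x)-G(x)\bigr|\,\mathrm{d}x,
\]
the Kolmogorov and the Kantorovich (Wasserstein) distance respectively.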