Bounds are obtained for the Kullback-Leibler discrimination distance between two random vectors X and Y. If X is a sequence of independent random variables whose densities have similar tail behavior and Y = AX, where A is an invertible matrix, then the bounds are a product of terms depending on A and X separately. We apply these bounds to obtain the best possible rate of convergence for any estimator of the parameters of an autoregressive process with innovations in the domain of attraction of a stable law. To complete the exposition for this application, we provide a general theorem establishing the link between total variation proximity of measures and the rate of convergence of statistical estimates.
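As background (these are standard definitions, not formulas quoted from the paper), the Kullback-Leibler discrimination between densities and the change-of-variables formula for an invertible linear transformation read:

\[
K(f_X \,\|\, f_Y) \;=\; \int f_X(x)\,\log\frac{f_X(x)}{f_Y(x)}\,dx,
\qquad
f_{AX}(y) \;=\; |\det A|^{-1}\, f_X\!\left(A^{-1}y\right).
\]

The second identity is what makes it plausible that bounds on the discrimination can factor into a term depending on the matrix A and a term depending on the law of X alone.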