Abstract

This study integrates information theory and extreme value theory to enhance the prediction of extreme events. Information-theoretic measures provide a foundation for comparing models in the tails. The theoretical findings suggest that (1) the entropy of block maxima converges to the entropy of the generalized extreme value (GEV) distribution, (2) the rate of convergence is controlled by the GEV shape parameter, and (3) the entropy of block maxima is a monotonically decreasing function of the block size. An empirical analysis of E-mini S&P 500 futures data evaluates financial risk, using entropy and Kullback–Leibler divergence to capture the information content of extreme events.
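As a rough illustration of the block-maxima-versus-GEV comparison described above, the sketch below simulates heavy-tailed Student-t returns as a stand-in for the E-mini S&P 500 futures data (which is not reproduced here), extracts block maxima for a few block sizes, fits a GEV model with scipy.stats.genextreme, and compares a nonparametric (Vasicek spacing) entropy estimate with the entropy of the fitted GEV. All data, parameter choices, and the choice of entropy estimator are assumptions for illustration, not the paper's method.

```python
# Sketch: compare the empirical entropy of block maxima with the entropy of a
# fitted GEV distribution.  Student-t returns are a placeholder for futures data.
import numpy as np
from scipy.stats import genextreme

def vasicek_entropy(sample, m=None):
    """Nonparametric (Vasicek spacing) estimate of differential entropy."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))          # common default window size
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]    # clip spacings at the boundaries
    lower = x[np.maximum(idx - m, 0)]
    return float(np.mean(np.log(n / (2.0 * m) * (upper - lower))))

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=200_000)  # heavy-tailed placeholder returns

for block_size in (50, 250, 1000):
    # block maxima: largest return within each non-overlapping block
    n_blocks = returns.size // block_size
    maxima = returns[: n_blocks * block_size].reshape(n_blocks, block_size).max(axis=1)

    # fit the GEV; SciPy's shape parameter c equals -xi in the usual EVT notation
    c, loc, scale = genextreme.fit(maxima)
    xi = -c

    h_empirical = vasicek_entropy(maxima)
    # closed form for comparison: H(GEV) = ln(sigma) + gamma*xi + gamma + 1,
    # with gamma the Euler-Mascheroni constant; genextreme.entropy matches it
    h_gev = genextreme.entropy(c, loc=loc, scale=scale)

    print(f"block={block_size:5d}  xi~{xi:+.3f}  "
          f"H_empirical={h_empirical:.3f}  H_GEV={h_gev:.3f}")
```

A Kullback–Leibler comparison between competing tail models could be carried out similarly, for example by numerically integrating the log-density ratio of two fitted GEV candidates, but that step is omitted from this sketch.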
