Abstract
This study considers a cumulative residual entropy (CRE)-based goodness-of-fit (GOF) test for location-scale time series models. The CRE-based GOF test is first introduced for i.i.d. samples, and the asymptotic behavior of the test and its bootstrap version is then investigated for location-scale time series models. In particular, the influence of change points on the GOF test is studied through Monte Carlo simulations.
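The abstract does not reproduce the test statistic itself. As background, the cumulative residual entropy of a random variable is CRE(X) = -∫ S(x) log S(x) dx, where S is the survival function, and it admits a simple plug-in estimator from a sample. The sketch below illustrates that estimator only; it is not the paper's exact GOF statistic, and the function name `empirical_cre` is hypothetical.

```python
import math

def empirical_cre(sample):
    """Plug-in estimate of the cumulative residual entropy of a sample.

    CRE(X) = -integral of S(x) * log(S(x)) dx, with S the survival function.
    The empirical survival function is the step function S_n(x) = (n - i)/n
    on [x_(i), x_(i+1)), so the integral reduces to a finite sum over the
    gaps between consecutive order statistics.
    """
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(1, n):
        s = (n - i) / n  # survival value on the interval [x[i-1], x[i])
        total -= (x[i] - x[i - 1]) * s * math.log(s)
    return total

# For a standard exponential distribution the true CRE equals 1, so the
# empirical value from a large Exp(1) sample should be close to 1.
```

A GOF test in this spirit compares such an empirical CRE (or a CRE-type divergence) against its value under the hypothesized location-scale model, rejecting when the discrepancy is large.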