Abstract

In this paper we use the Kullback-Leibler divergence to measure the distance between the posteriors of the autoregressive (AR) model order, aiming to evaluate mathematically the sensitivity of model identification to different types of priors on the model parameters. In particular, we consider three priors for the AR model coefficients, namely Jeffreys', g, and natural conjugate priors, and three priors for the model order, namely uniform, arithmetic, and geometric priors. Using a large number of Monte Carlo simulations with various values of the model coefficients, model order, and sample size, we evaluate the impact of the distance between posteriors on the accuracy of model identification. The simulation study results show that the posterior of the model order is sensitive to the prior distributions, and that the highest accuracy of model identification is obtained from the posterior resulting from the g-prior. The same results are obtained from the application to real-world time series datasets.

Highlights

  • Prior specification plays an important role in the Bayesian analysis of time series models

  • We observe that the KL divergence and its calibration between ζj(p|y) and ζg1(p|y) and between ζj(p|y) and ζg3(p|y) are very close; for example, when n = 200 their calibration values are about 0.73. These divergences are considerably larger than those between ζj(p|y) and ζg2(p|y) and between ζj(p|y) and ζn(p|y), both of which have calibration values of about 0.6 for n = 200. The impact of these posterior divergences can be observed in the percentage of correctly identified models presented in Table (1): the highest percentage is obtained from the two posteriors resulting from the g-prior with g = 1/n and g = kp/n, followed by the percentages obtained from the posterior resulting from the g-prior with g = p/n and from the posterior resulting from the natural conjugate prior, while the lowest percentage is obtained from the posterior resulting from the Jeffreys' prior

  • The posteriors of the model order resulting from the employed priors for the model coefficients are markedly different, and the highest percentage of correctly identified models is obtained from the posterior resulting from the g-prior distribution
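The divergence and calibration values quoted above can be reproduced for any pair of posterior mass functions of the model order. The following is a minimal Python sketch, not the paper's code: the function names and the example posteriors are illustrative, and the calibration formula assumed here is the standard one of McCulloch (1989), which maps a KL divergence d to the value k in [0.5, 1] solving KL(Bernoulli(0.5) || Bernoulli(k)) = d.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence KL(p || q) between two discrete posterior
    mass functions of the model order (defined on the same support)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def calibration(d):
    """Calibrate a KL divergence d onto [0.5, 1] by solving
    KL(Bernoulli(0.5) || Bernoulli(k)) = d for k (McCulloch, 1989)."""
    return 0.5 * (1.0 + np.sqrt(1.0 - np.exp(-2.0 * d)))

# Two hypothetical posteriors over model orders p = 1, ..., 4
post_a = [0.10, 0.60, 0.20, 0.10]
post_b = [0.15, 0.45, 0.25, 0.15]
d = kl_divergence(post_a, post_b)
print(d, calibration(d))
```

A calibration near 0.5 indicates the two posteriors are nearly indistinguishable, while values approaching 1 indicate a substantial divergence; the values of about 0.73 versus 0.6 reported above are read on this scale.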

Introduction

Prior specification plays an important role in the Bayesian analysis of time series models. Prior distributions for the model parameters have been employed in the Bayesian analysis of autoregressive models (Fan and Yao, 2009), multivariate autoregressive models (Shaarawy and Ali, 2008), and multivariate moving average models (Shaarawy and Ali, 2012). Although these researchers have employed one or more of the abovementioned prior distributions to derive the posterior mass function of the model order, none of them has evaluated the sensitivity of model identification to different types of prior distributions. We use the Kullback-Leibler (KL) divergence (Kullback and Leibler, 1951) to measure the distance between the posteriors of the AR model order resulting from different types of priors, in order to evaluate mathematically the sensitivity of model identification to the employed priors.

Autoregressive Time Series Models and Bayesian Concepts
Kullback-Leibler Divergence Between Posterior Mass Functions of Model Order
Application
Simulation Study
Conclusions