Abstract

We propose a simple yet powerful method to construct strictly stationary Markovian models with given but arbitrary invariant distributions. The idea is based on a Poisson-type transform modulating the dependence structure in the model. An appealing feature of our approach is the possibility of controlling the underlying transition probabilities and, therefore, incorporating them within standard estimation methods. Given the resulting representation of the transition density, a Gibbs sampler algorithm based on the slice method is proposed and implemented. In the discrete-time case, special attention is paid to the class of generalized inverse Gaussian distributions. In the continuous-time case, we first provide a brief treatment of the class of gamma distributions, and then extend it to cover other invariant distributions, such as the generalized extreme value class. The proposed approach and estimation algorithm are illustrated with real financial datasets. Supplementary materials for this article are available online.
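
The slice method mentioned above is a standard auxiliary-variable technique. The sketch below shows one generic univariate slice-sampling update (stepping out and shrinkage, as in Neal, 2003). It is a minimal illustration only: the paper's Gibbs sampler exploits the specific transition-density representation developed in the article, which is not reproduced on this page, and the function name slice_sample_step is a hypothetical choice.

```python
import numpy as np

def slice_sample_step(x, log_density, width=1.0, rng=None):
    """One univariate slice-sampling update targeting exp(log_density).

    Introduces an auxiliary uniform level under the density and samples the
    next state uniformly from the resulting horizontal slice, located by
    stepping out and then shrinking (Neal, 2003).
    """
    rng = np.random.default_rng(rng)
    log_u = log_density(x) + np.log(rng.uniform())   # auxiliary slice level
    # Step out: expand an interval around x until both ends leave the slice.
    left = x - width * rng.uniform()
    right = left + width
    while log_density(left) > log_u:
        left -= width
    while log_density(right) > log_u:
        right += width
    # Shrink: propose uniformly in [left, right], shrinking toward x on rejection.
    while True:
        prop = rng.uniform(left, right)
        if log_density(prop) > log_u:
            return prop
        if prop < x:
            left = prop
        else:
            right = prop
```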

Highlights

  • Stationarity and other stability properties represent a crucial component in the theory and application of stochastic processes

  • Part of the dependence in the model is induced by the choice of marginal density, f, which in turn can be selected according to the nature of the phenomenon or data under study

  • A related, but different, idea is the one proposed by Nakajima et al. (2012), where innovations of the underlying latent process follow a Type I generalized extreme value (GEV) distribution and induce an observation response again characterized by a Type I GEV distribution

Summary

INTRODUCTION

Stationarity and other stability properties represent a crucial component in the theory and application of stochastic processes. Defining Markov models with prescribed invariant distributions poses a tradeoff between marginal and conditional properties, as one can have several models with different dependence structures while retaining the same stationary distribution. This issue can be handled, to some extent, with a particular context in mind, for example, by fulfilling certain continuity or dependency requirements. Within a similar setup, Pitt, Chatfield, and Walker (2002) exploited the reversibility property characterizing Gibbs sampler Markov chains to build strictly stationary AR(1)-type models with any choice of marginal distribution. Their approach is very general and requires making dependence choices to accommodate the specific modeling needs. In the continuous-time setup, we use the gamma distribution as the basic building block and obtain, via suitable transformations, a richer class of diffusion processes with known transition density. This includes, for instance, diffusions with generalized extreme value (GEV) invariant distributions that, to the best of our knowledge, have not been derived before.
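
As a concrete illustration of this type of construction, the sketch below simulates a Gamma(a, b)-stationary chain through a Poisson latent layer, in the spirit of the Gibbs-sampler-based approach of Pitt, Chatfield, and Walker (2002). The function name simulate_gamma_poisson_chain and the particular parameterization are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def simulate_gamma_poisson_chain(n, a, b, c, rng=None):
    """Simulate a strictly stationary chain with Gamma(a, b) marginals.

    One Gibbs-type step:
      z_t | x_t      ~ Poisson(c * x_t)         (latent Poisson layer)
      x_{t+1} | z_t  ~ Gamma(a + z_t, b + c)    (conjugate update)
    The x-chain is reversible with respect to Gamma(a, b); the parameter
    c >= 0 tunes the strength of the serial dependence.
    """
    rng = np.random.default_rng(rng)
    x = np.empty(n)
    x[0] = rng.gamma(shape=a, scale=1.0 / b)    # start from the invariant law
    for t in range(n - 1):
        z = rng.poisson(c * x[t])               # Poisson-driven latent variable
        x[t + 1] = rng.gamma(shape=a + z, scale=1.0 / (b + c))
    return x

# Example: a Gamma(2, 1)-stationary path with moderate dependence.
path = simulate_gamma_poisson_chain(n=5000, a=2.0, b=1.0, c=3.0, rng=1)
```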

The Construction in Discrete Time
GIG-Stationary Poisson-Driven Markov Process
The General Case
Models Derived from a Ga-Stationary Poisson-Driven Markov Process
BAYESIAN ESTIMATION
ILLUSTRATIONS
Simulated Data
Real Data
Findings
CONCLUDING REMARKS