Abstract

We discuss a parameter estimation problem for a Gaussian copula model under misspecification. Conventional estimators such as the maximum likelihood estimator (MLE) do not work well if the model is misspecified. We propose an estimator that minimizes the projective power entropy, which we call the γ-estimator, where γ denotes the power index. A feasible form of the projective power entropy is given that suits the Gaussian copula model. It is shown that the γ-estimator is robust against outliers. In addition, the γ-estimator can appropriately detect a heterogeneous structure of the underlying distribution, even if the underlying distribution consists of several different copula components while a single Gaussian copula is used as the statistical model. We explore this ability of the γ-estimator to detect local structures in comparison with the MLE. We also propose a fixed-point algorithm to obtain the γ-estimator. The usefulness of the proposed methodology is demonstrated in numerical experiments.
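For reference, a minimal sketch of the objective, assuming the commonly used form of the projective power (γ) cross entropy between a data density g and a model density f (the paper's exact normalization may differ):

$$
C_\gamma(g, f) \;=\; -\frac{1}{\gamma}\,
\frac{\int g(x)\, f(x)^{\gamma}\, dx}
     {\bigl(\int f(x)^{1+\gamma}\, dx\bigr)^{\gamma/(1+\gamma)}},
\qquad
\hat\theta_\gamma \;=\; \arg\min_{\theta}\;
-\frac{1}{\gamma}\,
\frac{\tfrac{1}{n}\sum_{i=1}^{n} f(x_i;\theta)^{\gamma}}
     {\bigl(\int f(x;\theta)^{1+\gamma}\, dx\bigr)^{\gamma/(1+\gamma)}}.
$$

For the Gaussian copula density $c_G(u;P)$ with correlation matrix $P$, a routine change of variables $u_j = \Phi(z_j)$ gives a closed-form normalizing integral (a derivation sketch, not necessarily the paper's "feasible form"):

$$
\int_{[0,1]^d} c_G(u;P)^{1+\gamma}\, du
\;=\; |P|^{-(1+\gamma)/2}\,
\bigl|\,(1+\gamma)\,P^{-1} - \gamma I\,\bigr|^{-1/2},
$$

valid whenever $(1+\gamma)P^{-1} - \gamma I$ is positive definite, which makes the empirical γ-loss explicitly computable for this model.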

Highlights

  • Applications of copula models have been increasing in number in recent years

  • The γ-estimator can appropriately detect a heterogeneous structure of the underlying distribution, even when the distribution consists of several different copula components while a single Gaussian copula is used as the statistical model; we explore this ability to detect local structures in comparison with the maximum likelihood estimator (MLE)

  • As far as we know, only a few works tackle the identification and statistical estimation of mixtures of copula models, and most of them rely on MCMC algorithms


Summary

Introduction

Applications of copula models have been increasing in number in recent years, with a variety of applications in finance, risk management [1], and multivariate time series analysis [2]. Our research shows that even if a single Gaussian copula model is incorrectly fitted to data from the mixture distribution (1), the γ-estimator can detect both P1 and P2 separately, provided P1 and P2 are "distinct" enough and τ is close to 0.5. The projective power cross entropy, viewed as a function of P, has either a single local minimum or several local minima, depending on the underlying distribution. We show that if P1 and P2 are "distinct" enough and τ is near 0.5, the projective power cross entropy between the underlying mixture distribution (1) and the Gaussian copula cG(u, P) has two local minimizers, one near P1 and one near P2, and we propose to use these local minimizers to detect P1 and P2. The proofs of all theoretical results are provided in the appendix.
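The following minimal numerical sketch (not the paper's algorithm) illustrates this claim in the bivariate case: data are drawn from a 50/50 mixture of two Gaussian copulas, and the empirical γ-loss of a single Gaussian copula is scanned over its correlation parameter. The form of the γ-loss, the closed-form normalizer, and all concrete values (rho1 = 0.8, rho2 = -0.8, tau = 0.5, gamma = 0.5, n = 2000) are illustrative assumptions.

```python
# Sketch: two local minimizers of the gamma-loss vs. a single MLE-type minimizer
# when a single Gaussian copula is fitted to a mixture of two Gaussian copulas.
import numpy as np
from scipy.stats import norm, multivariate_normal

rng = np.random.default_rng(0)
n, gamma = 2000, 0.5
rho1, rho2, tau = 0.8, -0.8, 0.5   # mixture (1): tau*c(u;P1) + (1-tau)*c(u;P2)

def sample_gauss_copula(n, rho):
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    return norm.cdf(z)                      # map margins to uniforms

# draw from the mixture
k = rng.random(n) < tau
u = np.where(k[:, None], sample_gauss_copula(n, rho1), sample_gauss_copula(n, rho2))

def copula_logpdf(u, rho):
    # log c(u; P) = log phi_P(z) - sum_j log phi(z_j), z = Phi^{-1}(u)
    z = norm.ppf(u)
    P = np.array([[1.0, rho], [rho, 1.0]])
    return multivariate_normal(cov=P).logpdf(z) - norm.logpdf(z).sum(axis=1)

def gamma_loss(u, rho, gamma):
    # empirical projective power cross entropy of the single-copula model
    # (assumed form: -(1/gamma) * mean(c^gamma) / (int c^{1+gamma} du)^{gamma/(1+gamma)})
    P = np.array([[1.0, rho], [rho, 1.0]])
    A = (1 + gamma) * np.linalg.inv(P) - gamma * np.eye(2)
    norm_int = np.linalg.det(P) ** (-(1 + gamma) / 2) * np.linalg.det(A) ** (-0.5)
    cg = np.exp(gamma * copula_logpdf(u, rho))
    return -cg.mean() / (gamma * norm_int ** (gamma / (1 + gamma)))

grid = np.linspace(-0.95, 0.95, 191)
gl = np.array([gamma_loss(u, r, gamma) for r in grid])
nll = np.array([-copula_logpdf(u, r).mean() for r in grid])

# interior local minimizers of the gamma-loss vs. the single likelihood-based minimizer
loc = [grid[i] for i in range(1, len(grid) - 1) if gl[i] < gl[i - 1] and gl[i] < gl[i + 1]]
print("gamma-loss local minimizers:", np.round(loc, 2))        # expected near -0.8 and 0.8
print("min NLL at rho ~", round(grid[nll.argmin()], 2))        # expected near 0
```

In this illustrative setting the likelihood surface averages the two components into one minimizer near zero correlation, while the γ-loss retains a local minimizer near each component, which is the behavior the γ-estimator exploits.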

Estimation of the Gaussian Copula Model
Projective Power Entropy and γ-Estimator
Property of the γ-Estimator
Maximum Entropy Distribution
Robustness
Simulation Study
Result
Discussion
Derivation for the Algorithm
Proof of Theorem 1
Proof of Theorem 5
