Abstract

Suppose that a compound Poisson process is observed discretely in time, and assume that its jump distribution is supported on the set of natural numbers. In this paper we propose a nonparametric Bayesian approach to estimate the intensity of the underlying Poisson process and the distribution of the jumps. We provide a Markov chain Monte Carlo scheme for obtaining samples from the posterior. We apply our method to both simulated and real data examples, and compare its performance with the frequentist plug-in estimator proposed by Buchmann and Grübel. On the theoretical side, we study the posterior from the frequentist point of view and prove that, as the sample size n → ∞, it contracts around the "true", data-generating parameters at rate 1/√n, up to a logarithmic factor.

Highlights

  • Let N = (N_t : t ≥ 0) be a Poisson process with a constant intensity λ > 0, and let Y_1, Y_2, … be a sequence of independent random variables, each with distribution P, that are independent of N

  • We restrict our attention to the case where P is a discrete distribution with P(ℕ) = 1, and we write p for the corresponding probability mass function

  • In terms of computational effort, the time it takes to evaluate the Buchmann-Grübel estimator is negligible compared to our algorithm for sampling from the posterior. This is not surprising, as that frequentist estimator relies on a plug-in approach, whereas in our case an approximation to the posterior is obtained by Markov chain Monte Carlo (MCMC) simulation
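The plug-in idea behind the frequentist competitor can be sketched as follows. This is a simplified illustration, not the Buchmann–Grübel estimator itself (which is constructed via inversion of the compounding map and requires more care): assuming P puts no mass at 0, a zero increment occurs exactly when no jump fell in the interval, so λ can be read off from the zero-frequency of an estimated increment pmf q, and p can then be recovered by inverting the Panjer recursion for compound Poisson sums.

```python
import numpy as np

def decompound(q_hat, Delta, jmax):
    """Plug-in decompounding sketch (illustrative, not Buchmann-Gruebel's
    exact estimator): recover (lam, p) from an estimated increment pmf
    q_hat on {0, 1, ..., jmax} by inverting the Panjer recursion
        q_j = (lam*Delta/j) * sum_{k=1}^{j} k * p_k * q_{j-k},
        q_0 = exp(-lam*Delta),
    which holds for compound Poisson sums when P({0}) = 0."""
    lam = -np.log(q_hat[0]) / Delta        # zero increment <=> no jumps
    p = np.zeros(jmax + 1)                 # p[0] stays 0 by assumption
    for j in range(1, jmax + 1):
        s = sum(k * p[k] * q_hat[j - k] for k in range(1, j))
        p[j] = (q_hat[j] - (lam * Delta / j) * s) / (lam * Delta * q_hat[0])
    return lam, p
```

In practice q_hat would be the empirical pmf of the observed increments; the sketch only shows why evaluating such an estimator is essentially instantaneous compared with running an MCMC chain.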


Summary

Problem formulation

Let N = (N_t : t ≥ 0) be a Poisson process with a constant intensity λ > 0, and let Y_1, Y_2, … be a sequence of independent random variables, each with distribution P, that are independent of N. A compound Poisson process (abbreviated CPP) X = (X_t : t ≥ 0) is defined by X_t = Y_1 + … + Y_{N_t}. Assume that the process X is observed at discrete times 0 < t_1 < t_2 < … < t_n, yielding the observations X_{t_1}, …, X_{t_n}; our goal is to estimate the jump size distribution P and the intensity λ. We restrict attention to the case where P is discrete, P(ℕ) = 1, and we will at times identify P with the corresponding probability mass function p. When the {t_i} are equidistant on [0, T], the increments Z_i = X_{t_i} − X_{t_{i−1}} are independent and have a common distribution Q satisfying Q(ℕ_0) = 1. Since Q determines the pair (λ, P), we can base our estimation procedure directly on the increments Z_1, …, Z_n. For our specific statistical approach, the Lévy measure parameterization turns out to be more advantageous from the computational point of view
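The observation scheme above can be made concrete with a short simulation. This is a sketch under illustrative parameter values (λ = 2, Δ = 0.5, a jump pmf on {1, 2, 3}) that are not taken from the paper; it uses the fact that on an equidistant grid the increments Z_i are i.i.d. compound Poisson random variables on ℕ_0.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions for this sketch, not from the paper):
lam, Delta, n = 2.0, 0.5, 1000          # intensity, grid spacing, sample size
support = np.array([1, 2, 3])           # jump sizes, P(N) = 1
p = np.array([0.5, 0.3, 0.2])           # jump pmf

# Each increment Z_i = X_{t_i} - X_{t_{i-1}} is a compound Poisson sum:
# the number of jumps in an interval is Poisson(lam * Delta), and each
# jump is drawn independently from p.
counts = rng.poisson(lam * Delta, size=n)
Z = np.array([rng.choice(support, size=k, p=p).sum() for k in counts])

# Q(N_0) = 1: the increments are non-negative integers, and these Z_1,
# ..., Z_n are exactly the data the estimation procedure is based on.
assert (Z >= 0).all()
```

Note that the zero increments correspond to intervals in which no jump occurred, which is what makes estimation of λ from discrete observations feasible.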

Approach and results
Related literature
Outline
Notation
ALGORITHM FOR DRAWING FROM THE POSTERIOR
Data augmentation
SIMULATION EXAMPLES
Uniform base distribution
Geometric base distribution
Monte Carlo study
Computing time
Horse kick data
Plant data
FREQUENTIST ASYMPTOTICS
Basic posterior inequality via the stability estimate
Proof of Theorem 1
Entropy
Prior mass
OUTLOOK