Abstract

In this paper, we apply Grenander’s method of sieves to the estimation of the infinite-dimensional parameter in a nonstationary linear diffusion model. We use an increasing sequence of finite-dimensional subspaces of the parameter space as the natural sieves on which we maximize the likelihood function. We show that if the dimension of the sieves tends to infinity with the sample size at a sufficiently slow rate, then the sequence of restricted maximum likelihood estimators of the parameter is consistent and asymptotically normal.
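To fix ideas, a minimal sketch of the sieve construction follows; the specific drift form, the basis \(\{\phi_k\}\), and the rate condition on \(d_n\) are illustrative assumptions, not details taken from the paper. Consider a diffusion whose drift depends on an infinite-dimensional parameter \(\theta \in \Theta \subset L^2[0,T]\), for instance
\[
dX_t = \theta(t)\,X_t\,dt + dW_t, \qquad 0 \le t \le T.
\]
The sieves are nested finite-dimensional subspaces of the parameter space,
\[
\Theta_n = \Big\{\, \textstyle\sum_{k=1}^{d_n} \beta_k \phi_k : \beta \in \mathbb{R}^{d_n} \,\Big\},
\qquad \Theta_1 \subset \Theta_2 \subset \cdots \subset \Theta, \quad d_n \to \infty,
\]
where \(\{\phi_k\}\) is an orthonormal basis of \(L^2[0,T]\). The restricted (sieve) maximum likelihood estimator based on a sample of size \(n\) is
\[
\hat{\theta}_n = \operatorname*{arg\,max}_{\theta \in \Theta_n} \ell_n(\theta),
\]
with \(\ell_n\) the log-likelihood of the observed path. Consistency and asymptotic normality then hinge on \(d_n\) growing slowly relative to \(n\), e.g. \(d_n = o(n^{\gamma})\) for a suitable \(\gamma > 0\) (again an assumed form of the rate condition, not the paper’s stated one).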
