Abstract

In this paper, we introduce the infinite continuous mixture of Dirichlet distributions as a generalization of the infinite mixture of Dirichlet distributions, in order to avoid the limitation of choosing the \textit{a priori} sample size for the expectation \textit{a posteriori} estimator. Since the mixture of posterior distributions is difficult to obtain analytically, Monte Carlo sampling is used to approximate it. A new parametrization of the proposed distribution is derived, and a mixture expectation \textit{a posteriori} estimator of the unknown parameters is then suggested. The proposed estimator solves the problem of constructing a Bayesian estimator of proportions without specifying particular parameters or a sample size for the prior knowledge. Some asymptotic properties of this estimator are derived, specifically its bias and variance. Its consistency and asymptotic normality as the sample size tends to infinity are also established, and its credible interval is determined. The performance of the proposed estimator is illustrated both theoretically and by means of a simulation study. Finally, a comparative simulation study of the learned estimates (the proposed mixture expectation \textit{a posteriori} estimator, the standard Bayesian estimator, the maximum likelihood estimator, and the Jeffreys estimator) is presented. According to this simulation, the prior infinite mixture of Dirichlet distributions offers higher accuracy and flexibility for modeling and learning data.
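To make the idea concrete, the following is a minimal sketch of a Monte Carlo mixture expectation \textit{a posteriori} estimate for proportions under a Dirichlet prior. It assumes the standard Dirichlet-multinomial conjugacy (posterior mean $(n_i + s/k)/(n + s)$ for a symmetric prior with total prior sample size $s$); the Gamma hyperprior on $s$, the data, and all function names are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative observed category counts (not from the paper).
counts = np.array([30.0, 50.0, 20.0])


def eap(counts, s):
    """Expectation a posteriori under a fixed symmetric Dirichlet prior.

    Prior Dir(s/k, ..., s/k) with total prior sample size s gives the
    posterior mean (counts_i + s/k) / (n + s) for multinomial counts.
    """
    k = counts.size
    return (counts + s / k) / (counts.sum() + s)


def mixture_eap(counts, n_draws=10_000):
    """Mixture EAP: average the fixed-s estimator over a hyperprior on s.

    Instead of committing to one prior sample size s, draw it from a
    hyperprior (a Gamma(2, 1), chosen here purely for illustration) and
    average the resulting posterior means, approximating the continuous
    mixture by Monte Carlo.
    """
    s_draws = rng.gamma(shape=2.0, scale=1.0, size=n_draws)
    estimates = np.array([eap(counts, s) for s in s_draws])
    return estimates.mean(axis=0)


est = mixture_eap(counts)
```

Because each fixed-$s$ posterior mean is a probability vector, the Monte Carlo average is one as well, and for a diffuse hyperprior it stays close to the empirical proportions while no single prior sample size ever has to be chosen.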
