Abstract

We propose an unsupervised Bayesian classifier based on a non-parametric expectation-maximization (EM) algorithm. The non-parametric aspect comes from the use of orthogonal probability density function (pdf) estimation, which reduces to estimating the first Fourier coefficients of the pdf with respect to a given orthogonal basis. The mixture identification step, based on maximization of the likelihood, can therefore be carried out without any hypothesis on the form of the conditional pdfs. For the unsupervised image segmentation example, this means we need no assumption about the distribution of the gray-level image pixels. The generalization to the multivariate case is obtained by considering a multidimensional orthogonal function basis. In this paper, we give simulation results on the determination of the smoothing parameter and on the computation of the classification error.
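As a minimal sketch of the orthogonal-series idea described above (not the paper's implementation): the Fourier coefficients a_j = E[phi_j(X)] of the pdf with respect to an orthonormal basis can be estimated by empirical means over the samples, and the truncation order J plays the role of the smoothing parameter. The cosine basis on [0, 1] used here is an illustrative choice.

```python
import numpy as np

def cosine_basis(j, x):
    # Orthonormal cosine basis on [0, 1]: phi_0(x) = 1, phi_j(x) = sqrt(2) cos(j pi x).
    if j == 0:
        return np.ones_like(x)
    return np.sqrt(2.0) * np.cos(j * np.pi * x)

def orthogonal_pdf_estimate(samples, J):
    # Estimate the first J+1 Fourier coefficients a_j = E[phi_j(X)]
    # by their empirical means over the sample.
    coeffs = np.array([cosine_basis(j, samples).mean() for j in range(J + 1)])

    def f_hat(x):
        # Truncated orthogonal series: f_hat(x) = sum_j a_j phi_j(x).
        return sum(c * cosine_basis(j, x) for j, c in enumerate(coeffs))

    return f_hat
```

For samples drawn uniformly on [0, 1], the estimate is close to the flat density f(x) = 1; in the EM setting, such an estimator would replace the parametric conditional pdf in each mixture component.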
