Abstract

To estimate the number of unimodal components in a mixture model of a marginal probability distribution of signals while the model is learned with a conventional Expectation-Maximization (EM) algorithm, a modification of the well-known Akaike information criterion (AIC), called the modified AIC (mAIC), is proposed. Embedding the mAIC into the EM algorithm allows the least informative components to be excluded sequentially, one by one, from an initially excessive (over-fitted) set. In experiments, empirical marginal signal distributions of synthetic phantoms and real medical 3D images (lung CT and brain MRI) were modeled with mixtures of continuous or discrete Gaussians to describe their visual appearance. The mAIC increased markedly and monotonically towards its maximum at the proper number of components, which is known for the synthetic phantoms and practically justified for the real images. These results confirm the accuracy and robustness of the proposed automated mAIC-EM learning.
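The abstract only outlines the scheme; the sketch below illustrates the general idea of fitting an over-complete mixture by EM and pruning the least informative component one by one while tracking a model-selection criterion. It is not the authors' implementation: the paper's mAIC formula (which is maximized) is not reproduced here, so the classical AIC (minimized) is used as a stand-in criterion, the smallest mixing weight is taken as a proxy for "least informative", and the functions `em_gmm_1d` and `prune_components` are hypothetical names for a 1-D continuous-Gaussian case.

```python
# Minimal sketch (assumptions noted above): EM for a 1-D Gaussian mixture,
# started with an excessive number of components, with sequential pruning
# of the smallest-weight component and classical AIC as the selection score.
import numpy as np

def em_gmm_1d(x, w, mu, var, n_iter=100, eps=1e-9):
    """Run EM for a 1-D Gaussian mixture from the given initial parameters."""
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        d = x[:, None] - mu[None, :]
        logp = -0.5 * (d**2 / var + np.log(2 * np.pi * var)) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0) + eps
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk + eps
    # Log-likelihood of the data under the fitted mixture
    p = (w * np.exp(-0.5 * (x[:, None] - mu)**2 / var)
         / np.sqrt(2 * np.pi * var)).sum(axis=1)
    return w, mu, var, np.log(p + eps).sum()

def prune_components(x, k_max=10, k_min=1):
    """Start from an over-fitted mixture, drop the weakest component one by
    one, and keep the model with the best (lowest) stand-in AIC."""
    w = np.full(k_max, 1.0 / k_max)
    mu = np.quantile(x, np.linspace(0.05, 0.95, k_max))
    var = np.full(k_max, x.var())
    best = None
    for k in range(k_max, k_min - 1, -1):
        w, mu, var, loglik = em_gmm_1d(x, w / w.sum(), mu, var)
        aic = 2 * (3 * k - 1) - 2 * loglik   # 3k - 1 free parameters
        if best is None or aic < best[0]:
            best = (aic, w.copy(), mu.copy(), var.copy())
        # Exclude the least informative (smallest-weight) component
        idx = np.argsort(w)[1:]
        w, mu, var = w[idx], mu[idx], var[idx]
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-3, 1.0, 2000), rng.normal(2, 0.7, 3000)])
    aic, w, mu, var = prune_components(x)
    print("selected components:", len(w), "AIC:", round(aic, 1))
```

On the two-component synthetic example above, the stand-in criterion is best near the true number of modes; the paper instead reports a monotone rise of its mAIC to a maximum at the proper number of components.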
