Abstract

Among statistical models, Gaussian Mixture Models (GMMs) are widely used to describe data that can be fitted by a mixture of Gaussian curves. Several methods have been introduced to estimate the optimal parameters of a GMM fitted to data, and the accuracy of these estimates is crucial for interpreting the data. In this paper, we propose a new approach that estimates the parameters of a GMM using the critical points of the Tsallis entropy to adjust the accuracy of each parameter. To evaluate the proposed method, seven GMMs of simulated random (noisy) samples generated in MATLAB were used; each simulated model was repeated 1000 times to generate 1000 random values obeying the GMM. In addition, five GMM-shaped samples extracted from magnetic resonance brain images were used, targeting image-segmentation applications. For comparison, Expectation-Maximization (EM), K-means, and Shannon's estimator were applied to the same datasets. The four estimation methods were evaluated using accuracy, the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and the mean squared error (MSE). For the simulated data, the mean accuracies of the Tsallis estimator for the mean values, variances, and proportions were 99.9 (±0.1)%, 99.8 (±0.2)%, and 99.7 (±0.3)%, respectively. For both datasets, the accuracies of the Tsallis estimator were significantly higher than those of EM, K-means, and Shannon's estimator. By increasing the accuracy of the estimated parameters, the Tsallis estimator can be used in statistical approaches and machine learning.
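As a minimal illustration of the baseline workflow the abstract describes (not the proposed Tsallis-entropy estimator, which is not reproduced here), the sketch below fits a two-component GMM to simulated noisy samples with the classical EM algorithm via scikit-learn's `GaussianMixture`, then scores the fit with AIC and BIC, two of the criteria used for comparison. The component means, variances, and proportions are hypothetical values chosen for the example.

```python
# Illustrative sketch only: EM-based GMM fitting plus AIC/BIC scoring,
# the baseline against which the abstract's Tsallis estimator is compared.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated two-component mixture (assumed parameters for illustration):
# means -2 and 3, unit variances, proportions 0.4 and 0.6
samples = np.concatenate([
    rng.normal(-2.0, 1.0, size=400),
    rng.normal(3.0, 1.0, size=600),
]).reshape(-1, 1)

# Fit with EM; estimated means, variances, and mixing proportions
gmm = GaussianMixture(n_components=2, random_state=0).fit(samples)
print("means:", gmm.means_.ravel())
print("variances:", gmm.covariances_.ravel())
print("proportions:", gmm.weights_)
print("AIC:", gmm.aic(samples), "BIC:", gmm.bic(samples))
```

Accuracy in the abstract's sense would then follow by comparing the estimated means, variances, and proportions against the ground-truth values used to simulate the data.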
