Abstract
Most tools and diagnostic models for Masticatory Efficiency (ME) are either poorly documented or severely limited to simple image processing approaches. This study presents a novel expert system for ME assessment based on automatic recognition of mixture patterns in masticated two-coloured chewing gums, using a combination of computational intelligence and image processing techniques. The hypotheses tested were that the proposed system could accurately relate specimens to the number of chewing cycles, and that it could identify differences between the mixture patterns of edentulous individuals before and after complete denture treatment. This study enrolled 80 fully dentate adults (41 females and 39 males, 25 ± 5 years of age) as the reference population, and 40 edentulous adults (21 females and 19 males, 72 ± 8.9 years of age) as the testing group. The system was calibrated using features extracted from 400 samples covering 0, 10, 15, and 20 chewing cycles. The calibrated system was used to automatically analyse and classify a set of 160 specimens retrieved from individuals in the testing group over two appointments. The ME was then computed as the predicted number of chewing strokes that a healthy reference individual would need to achieve a similar degree of mixture, measured against the actual number of cycles applied to the specimen. The trained classifier obtained a Matthews Correlation Coefficient score of 0.97. ME measurements showed almost perfect agreement when pre- and post-treatment appointments were considered separately (κ ≥ 0.95). A Wilcoxon signed-rank test showed that complete denture treatment for edentulous patients elicited a statistically significant increase in the ME measurements (Z = -2.31, p < 0.01). We conclude that the proposed expert system reliably and accurately identifies mixture patterns and provides useful ME measurements.
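The ME measure summarised above is essentially a ratio between the predicted number of reference chewing cycles and the cycles actually applied to the specimen. The sketch below is a minimal illustration of that idea, assuming a simple linear ratio capped at 100%; the function name and the capping behaviour are assumptions for illustration, not the authors' exact formulation.

```python
# Hypothetical sketch of the ME measure described in the abstract:
# ME relates the predicted number of reference chewing cycles needed to
# reach the observed degree of mixture to the cycles actually applied.
# The name and the 0-100% capping are illustrative assumptions.

def masticatory_efficiency(predicted_reference_cycles: float,
                           applied_cycles: float) -> float:
    """Return ME as a percentage in the 0-100 range."""
    if applied_cycles <= 0:
        raise ValueError("applied_cycles must be positive")
    ratio = predicted_reference_cycles / applied_cycles
    return min(ratio, 1.0) * 100.0

# Example: a specimen chewed for 20 cycles whose mixture pattern matches
# what a reference individual would reach after 15 cycles -> ME = 75%.
print(masticatory_efficiency(15, 20))  # 75.0
```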
Highlights
This study explores the possibility of accurately measuring the level of mixture in two-coloured chewing gums subjected to mastication on a comprehensive reference scale (0 to 100%); an illustrative sketch of such a mixture index appears after these highlights
The proposed expert system involves the identification of mixture patterns in two-coloured chewing gums; these patterns are the basis of a classification procedure to compute the P score of new specimens obtained from masticatory-compromised individuals
These stages are related by an auxiliary component called the Masticatory Efficiency and Performance Assessment Technique (MEPAT), which contains the information generated during calibration and is needed to perform masticatory assessment tests accurately
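The following is a minimal sketch of how a 0-100% mixture level could be quantified from a segmented specimen image, assuming that chewing blends the two gum colours and therefore narrows the spread of hue values relative to an unmixed reference. The function, its inputs, and the hue-spread surrogate are illustrative assumptions and are not the P score defined in the paper (Eq 1).

```python
# Hypothetical mixture index on a 0-100% scale. Assumption: blending of the
# two gum colours reduces the spread of hue values across the specimen.
# For simplicity this ignores hue wrap-around; it is NOT the paper's P score.
import numpy as np
from matplotlib.colors import rgb_to_hsv

def mixture_percent(specimen_rgb: np.ndarray, unmixed_rgb: np.ndarray) -> float:
    """Both inputs are HxWx3 float arrays in [0, 1] containing only gum pixels."""
    hue_specimen = rgb_to_hsv(specimen_rgb)[..., 0]
    hue_reference = rgb_to_hsv(unmixed_rgb)[..., 0]
    spread_specimen = hue_specimen.std()
    spread_reference = hue_reference.std()   # spread of a completely unmixed sample
    if spread_reference == 0:
        return 0.0
    mixed_fraction = 1.0 - min(spread_specimen / spread_reference, 1.0)
    return 100.0 * mixed_fraction
```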
Summary
Knowledge base
The proposed expert system involves the identification of mixture patterns in two-coloured chewing gums; these patterns are the basis of a classification procedure to compute the P score (see Eq 1) of new specimens obtained from masticatory-compromised individuals. It is important to note that the structural characteristics of a chewing gum brand are related to its visual characteristics; moreover, a brand's availability may not be the same worldwide. To overcome these problems, the proposed system comprises two main components, as shown in Fig 2: a calibration stage oriented to identify patterns in the mixture of a reference population for a selected chewing gum brand, and a diagnosis stage oriented to determine the ME of a patient using the same brand of chewing gum.
The calibration stage aims to identify patterns in the visual characteristics of masticated two-coloured chewing gums by analysing a broad distribution of reference samples. This process can be resource-intensive and time-consuming because it requires the participation of various reference individuals and the analysis of multiple samples per participant. The calibration process comprises the test-food selection, reference population selection, sample retrieval, digitization, segmentation, feature extraction, feature selection, machine learning, and classifier validation steps, which are described below.
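As a rough illustration of the calibration stage outlined above, the sketch below chains feature scaling, feature selection, and a classifier trained on the known number of chewing cycles, then validates it with the Matthews Correlation Coefficient. The synthetic data, the selected feature count, and the choice of classifier are assumptions for illustration and do not reflect the configuration used in the study.

```python
# Rough sketch of the calibration stage: scale the extracted features,
# select the most informative ones, train a classifier on the known number
# of chewing cycles (0, 10, 15, 20), and validate it with the Matthews
# Correlation Coefficient. Data and model choices are illustrative only.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.metrics import make_scorer, matthews_corrcoef

# Placeholder synthetic data standing in for the real extracted features:
# one row of colour/texture features per calibration specimen, labelled
# with the number of chewing cycles applied to that specimen.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 30))
y = rng.choice([0, 10, 15, 20], size=400)

model = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=10)),
    ("clf", SVC(kernel="rbf")),
])

mcc_scores = cross_val_score(model, X, y, cv=5,
                             scoring=make_scorer(matthews_corrcoef))
print("cross-validated MCC:", mcc_scores.mean())
```

In a real calibration run, the feature matrix would come from the segmentation and feature extraction steps named above, and the fitted classifier would then be stored in the MEPAT component for use during the diagnosis stage.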