Abstract

Purpose – Pressure ulcer is a clinical pathology of localized damage to the skin and underlying tissue caused by pressure, shear and friction. Diagnosis, treatment and care of pressure ulcers involve high costs for health-care systems. Accurate wound evaluation is a critical task for optimizing the efficacy of treatment and care. Clinicians evaluate pressure ulcers by visual inspection of the damaged tissues, which is an imprecise way of assessing the wound state. Current computer vision approaches do not offer a global solution to this particular problem. The purpose of this paper is to use a hybrid learning approach based on neural and Bayesian networks to design a computational system for automatic tissue identification in wound images.

Design/methodology/approach – A mean shift procedure and a region-growing strategy are implemented for effective region segmentation. Color and texture features are extracted from the segmented regions. A set of k multi-layer perceptrons is trained with inputs consisting of color and texture patterns, and outputs consisting of categorical tissue classes determined by clinical experts. This training procedure is driven by a k-fold cross-validation method. Finally, a Bayesian committee machine is formed by training a Bayesian network to combine the classifications of the k neural networks (NNs).

Findings – The outcomes show high efficiency rates for a two-stage cascade approach to tissue identification. Given a non-homogeneous distribution of pattern classes, this hybrid approach has the additional advantage of increasing classification efficiency for patterns with relatively low frequencies.

Practical implications – The methodology and results presented in this paper could have important implications for the field of clinical pressure ulcer evaluation and diagnosis.

Originality/value – The novelty of this work is the use of a hybrid approach consisting of NNs and Bayesian classifiers, which are combined to increase the performance of a pattern recognition task applied to the real clinical problem of tissue detection under non-controlled illumination conditions.
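
The abstract only sketches the first stage of the pipeline (mean shift segmentation followed by per-region color and texture features), so the snippet below is a minimal illustration rather than the authors' implementation: the exact feature set, region-growing rules and parameter values are not given in the abstract, and the function names, the bandwidth value and the simple variance-based texture measure used here are stand-ins.

```python
# Illustrative sketch of the segmentation and feature-extraction stage:
# mean shift clustering of pixels, then colour/texture statistics per region.
# Parameters and features are assumptions, not the paper's actual choices.
import numpy as np
from sklearn.cluster import MeanShift

def segment_wound_image(image, bandwidth=0.1):
    """Cluster pixels of an H x W x 3 image (values in [0, 1]) with mean shift.

    Returns an H x W label map assigning each pixel to a candidate tissue region.
    """
    h, w, _ = image.shape
    # Joint colour + scaled spatial coordinates, a common mean-shift feature space.
    ys, xs = np.mgrid[0:h, 0:w]
    spatial = np.stack([ys / h, xs / w], axis=-1)
    features = np.concatenate([image, 0.5 * spatial], axis=-1).reshape(-1, 5)
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(features)
    return labels.reshape(h, w)

def region_features(image, labels):
    """Per-region colour statistics plus a crude texture proxy (colour variance)."""
    feats = []
    for region_id in np.unique(labels):
        pixels = image[labels == region_id]          # N x 3 colour samples
        colour = np.concatenate([pixels.mean(0), pixels.std(0)])
        texture = pixels.var(0).mean()               # stand-in for richer texture descriptors
        feats.append(np.concatenate([colour, [texture]]))
    return np.array(feats)
```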
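The second stage combines k multi-layer perceptrons trained under k-fold cross-validation. The sketch below assumes scikit-learn's MLPClassifier and KFold; the paper's combiner is a Bayesian network trained over the members' outputs, which is replaced here by a naive, independence-assuming sum of log-posteriors purely for illustration.

```python
# Illustrative sketch of the ensemble stage: one MLP per cross-validation fold,
# fused by a simple probabilistic committee. The naive log-posterior sum below
# is a stand-in for the paper's trained Bayesian network combiner.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPClassifier

def train_committee(X, y, k=5):
    """Train one MLP per fold of a k-fold split and return the list of members."""
    members = []
    for train_idx, _ in KFold(n_splits=k, shuffle=True, random_state=0).split(X):
        mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
        mlp.fit(X[train_idx], y[train_idx])
        members.append(mlp)
    return members

def committee_predict(members, X):
    """Fuse member posteriors by summing log-probabilities (naive combination).

    Assumes every training fold contained all tissue classes, so the
    predict_proba columns of the members are aligned.
    """
    log_post = sum(np.log(m.predict_proba(X) + 1e-12) for m in members)
    return members[0].classes_[np.argmax(log_post, axis=1)]
```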
