Abstract

Background: Approximately 8.5 million Americans suffer from a chronic wound. Because no objective system exists to measure and characterize wounds, the current standard of care relies heavily on provider guesswork. This leads to misinformed care decisions, prolonged healing times, and high healthcare expenditures.

Objective: This study describes the design and validation of a smartphone image-based system for measuring and characterizing chronic wounds in an automated and objective fashion.

Methods: Photos (n=81) were collected by the study team from patients (n=25) at the Johns Hopkins Bayview Wound Clinic in an IRB-approved study. Photos were taken with a variety of smartphones so that the training data set would capture the nuances of different smartphone cameras. We combined supervised image classification and computer vision to detect wound edges and segment the tissues within the wound. Fifteen individuals ("raters") with various levels of training were then instructed to trace wound regions in a diverse subset of the wound images arbitrarily selected by the study team (n=10). The ensemble wound-edge and tissue-segmentation algorithms were compared against an 80% inter-rater gold standard.

Results: The automated method achieved a sensitivity of 98.31 ± 2.18 and a specificity of 92.06 ± 7.86. In contrast, the ruler-based measurement resulted in a sensitivity of 1 ± 0 and a specificity of 0.57 ± 0.30. The automated method yielded a normalized area of 1.14 ± 0.17, whereas the standard-of-care method yielded a normalized area of 1.86 ± 0.30 relative to the gold standard. For tissue segmentation, the overall average classification accuracy under k-fold cross-validation using the sparse neural network method was 93.6% ± 3.3%.

Conclusions: These results illustrate the large overestimation of wound size that occurs when wounds are measured with a ruler, and they corroborate literature-reported values of measurement inaccuracy for standard methods. Our study shows the feasibility of an easily deployed smartphone system that classifies wounds automatically and with high accuracy. Such a system could make measurements by nurses in the home care environment objective, improving the accuracy of wound care and, potentially, patient outcomes.
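To make the evaluation described above concrete, the sketch below shows one way a pixel-wise comparison against an 80% inter-rater consensus mask could be computed. This is a minimal illustration, not the study's actual code: it assumes binary per-pixel wound masks, and the function names (consensus_mask, wound_metrics) and the NumPy-based implementation are hypothetical.

```python
import numpy as np

def consensus_mask(rater_masks, agreement=0.80):
    """Hypothetical gold-standard mask from multiple rater tracings.

    A pixel counts as wound when at least `agreement` (e.g., 80%) of the
    raters traced it as wound.
    """
    stacked = np.stack(rater_masks).astype(float)   # shape: (n_raters, H, W)
    return stacked.mean(axis=0) >= agreement         # boolean (H, W) consensus

def wound_metrics(pred_mask, gold_mask):
    """Pixel-wise sensitivity, specificity, and normalized area vs. gold standard."""
    pred = pred_mask.astype(bool)
    gold = gold_mask.astype(bool)
    tp = np.sum(pred & gold)      # wound pixels correctly detected
    tn = np.sum(~pred & ~gold)    # background pixels correctly excluded
    fp = np.sum(pred & ~gold)     # background pixels wrongly labeled wound
    fn = np.sum(~pred & gold)     # wound pixels missed
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    normalized_area = pred.sum() / gold.sum()  # 1.0 means exact area agreement
    return sensitivity, specificity, normalized_area
```

Under this formulation, a normalized area above 1 corresponds to overestimating wound size, which is consistent with the abstract's contrast between the automated method (1.14) and the ruler-based standard of care (1.86).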

