Abstract
In clinical routine, wound documentation is one of the most important contributing factors in treating patients with acute or chronic wounds. The wound documentation process is currently very time-consuming, often examiner-dependent, and therefore imprecise. This study aimed to validate a software-based method for automated segmentation and measurement of wounds on photographic images using the Mask R-CNN (Region-based Convolutional Neural Network). During the validation, five medical experts manually segmented an independent dataset of 35 wound photographs at two different points in time with an interval of 1 month. Simultaneously, the dataset was automatically segmented using the Mask R-CNN. Afterwards, the segmentation results were compared, and intra- and inter-rater analyses were performed. In the statistical evaluation, an analysis of variance (ANOVA) was carried out and Dice coefficients were calculated. The ANOVA showed no statistically significant differences across all raters and the network in either the first segmentation round (F = 1.424 and p > 0.228) or the second segmentation round (F = 0.9969 and p > 0.411). The repeated-measures analysis demonstrated no statistically significant differences in the segmentation quality of the medical experts over time (F = 6.05 and p > 0.09). However, a certain intra-rater variability was apparent, whereas the Mask R-CNN consistently produced identical segmentations regardless of the point in time. Using the software-based method for segmentation and measurement of wounds on photographs can accelerate the documentation process and improve the consistency of measured values while maintaining quality and precision.
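The abstract reports agreement between manual and automated segmentations via Dice coefficients but does not spell out the metric. As a minimal sketch, assuming binary wound masks of identical shape, the overlap could be scored as below; the array names and example masks are illustrative assumptions, not data or code from the study.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks.

    Dice = 2 * |A intersect B| / (|A| + |B|); 1.0 is perfect overlap, 0.0 is none.
    """
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    total = a.sum() + b.sum()
    if total == 0:
        # Both masks empty: treat as perfect agreement.
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / total

# Hypothetical 4x4 masks for illustration only (not study data):
manual_mask = np.array([[0, 1, 1, 0],
                        [0, 1, 1, 0],
                        [0, 0, 0, 0],
                        [0, 0, 0, 0]])
automated_mask = np.array([[0, 1, 1, 0],
                           [0, 1, 0, 0],
                           [0, 0, 0, 0],
                           [0, 0, 0, 0]])
print(dice_coefficient(manual_mask, automated_mask))  # ~0.857
```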