Abstract

Wound treatment in emergency care requires the rapid assessment of wound size by medical staff. Limited medical resources and empirical assessment of wounds can delay patient treatment, and manual contact-based measurement methods are often inaccurate and risk wound infection. This study aimed to develop an Automatic Wound Segmentation Assessment (AWSA) framework for real-time wound segmentation and automatic wound region estimation. The method uses a short-term dense concatenate classification network (STDC-Net) as its backbone, realizing a trade-off between segmentation accuracy and prediction speed. A coordinated attention mechanism was introduced to further improve segmentation performance. A functional relationship model between the pixels of a prior graphic and the shooting height was constructed to enable wound area measurement. Finally, extensive experiments on two types of wound datasets were conducted. The results showed that real-time AWSA outperformed state-of-the-art methods on metrics such as mAP, mIoU, recall, and Dice score. The AUC, which reflects overall segmentation ability, also reached the highest level, at about 99.5%. The FPS values of the proposed segmentation method on the two datasets were 100.08 and 102.11, respectively, about 42% higher than those of the second-ranked method, reflecting better real-time performance. Moreover, real-time AWSA automatically estimated the wound area in square centimeters with a relative error of only about 3.1%. In summary, by building on the STDC-Net backbone, the real-time AWSA method improved processing speed while accurately segmenting the wound, achieving the intended accuracy-speed trade-off.
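To illustrate the area-measurement idea described above, the following is a minimal sketch of how a pixel-count-to-area conversion parameterized by shooting height might be implemented. The calibration values, the inverse-square form of the height model, and all function names here are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Hypothetical calibration data: shooting height (cm) vs. pixel count of a
# reference marker ("prior graphic") of known physical area. These numbers
# are illustrative assumptions only.
REF_AREA_CM2 = 4.0  # e.g., a 2 cm x 2 cm printed marker
heights_cm = np.array([20.0, 25.0, 30.0, 35.0, 40.0])
ref_pixels = np.array([52000.0, 33300.0, 23100.0, 17000.0, 13000.0])

# Under a pinhole-camera assumption, the pixel count of a planar object
# scales with 1 / height^2; fit ref_pixels ~ k / height^2 by least squares.
k = np.sum(ref_pixels * heights_cm**-2) / np.sum(heights_cm**-4)

def pixels_per_cm2(height_cm: float) -> float:
    """Predicted pixel density (pixels per cm^2) at a given shooting height."""
    return (k / height_cm**2) / REF_AREA_CM2

def wound_area_cm2(wound_pixel_count: int, height_cm: float) -> float:
    """Convert a segmented wound's pixel count into physical area (cm^2)."""
    return wound_pixel_count / pixels_per_cm2(height_cm)

# Example: a segmented wound mask containing 18,500 pixels captured at 30 cm.
print(f"Estimated wound area: {wound_area_cm2(18500, 30.0):.2f} cm^2")
```

In practice, the pixel count would come from the STDC-Net segmentation mask, and the height-to-pixel-density relationship would be calibrated on the authors' own reference graphic rather than the synthetic values used here.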

