Abstract

Histotripsy is a focused ultrasound therapy that ablates tissue via the action of bubble clouds. It is under investigation to treat a number of ailments, including renal tumors. Ultrasound imaging is used to monitor histotripsy, though there remains a lack of definitive imaging metrics to confirm successful treatment outcomes. In this study, a convolutional neural network (CNN) was developed to segment ablation on ultrasound images. A transfer learning approach was used to replace the classification layers of the residual network ResNet-18. Inputs to the classification layers were based on ultrasound images of ablated red blood cell phantoms, with digital photographs serving as the ground truth. The efficacy of the CNN was compared to subtraction imaging and to manual segmentation of images by two board-certified radiologists. The CNN performed similarly to manual segmentation and outperformed segmentation with subtraction imaging. Predictions of the network improved over the course of treatment, with the Dice similarity coefficient less than 20% for fewer than 500 applied pulses but 85% for more than 750 applied pulses. The network was also applied to ultrasound images of ex vivo kidney exposed to histotripsy, which indicated a morphological shift in the treatment profile relative to the phantoms. These findings were consistent with histology that confirmed ablation of the targeted tissue. Overall, the CNN showed promise as a rapid means to assess histotripsy outcomes, and integrating CNN image segmentation to gauge ablation holds promise for automating treatment procedures.
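The Dice similarity coefficient used above to score the network's predictions against ground truth is computed from overlapping binary masks. A minimal NumPy sketch (illustrative only; the function name and example masks are not from the paper):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity coefficient between two binary masks, in [0, 1]."""
    pred = np.asarray(pred).astype(bool)
    truth = np.asarray(truth).astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# Hypothetical example: predicted ablation mask vs. ground-truth mask
pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
truth = np.array([[1, 0, 0],
                  [0, 1, 1]])
print(dice_coefficient(pred, truth))  # 2*2/(3+3) ≈ 0.667
```

A Dice value of 1 indicates perfect overlap between the predicted segmentation and the ground truth, while 0 indicates no overlap, which is why the coefficient rises as treatment progresses and the ablation zone becomes better defined on the images.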
