Abstract

Ultrasound-based navigation is a promising method in breast-conserving surgery, but tumor contouring often requires a radiologist at the time of surgery. Our goal is to develop a real-time, automatic, neural network-based tumor contouring process for intraoperative guidance. Segmentation accuracy is evaluated by both pixel-based metrics and expert visual rating. This retrospective study includes 7318 intraoperative ultrasound images acquired from 33 breast cancer patients, randomly split 80:20 between training and testing. We implement a u-net architecture to label each pixel on ultrasound images as either tumor or healthy breast tissue. Quantitative metrics are calculated to evaluate the model's accuracy. Contour quality and usability are also assessed by fellowship-trained breast radiologists and surgical oncologists. Additionally, the viability of using our u-net model in an existing surgical navigation system is evaluated by measuring the segmentation frame rate. The mean Dice similarity coefficient of our u-net model is 0.78, with an area under the receiver-operating characteristic curve of 0.94, sensitivity of 0.95, and specificity of 0.67. Expert visual ratings are positive, with 93% of responses rating tumor contour quality at or above 7/10, and 75% of responses rating contour quality at or above 8/10. Real-time tumor segmentation achieves a frame rate of 16 frames per second, sufficient for clinical use. Neural networks trained with intraoperative ultrasound images provide consistent tumor segmentations that are well received by clinicians. These findings suggest that neural networks are a promising adjunct to alleviate radiologist workload and improve efficiency in breast-conserving surgery navigation systems.
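
The pixel-based metrics reported above (Dice similarity coefficient, sensitivity, and specificity) are simple functions of the per-pixel confusion counts between a predicted tumor mask and the ground-truth mask. The following is a minimal NumPy sketch of that computation, not the authors' implementation; the function name, the epsilon guard against empty masks, and the toy example masks are illustrative assumptions.

import numpy as np

def pixel_metrics(pred, truth, eps=1e-8):
    """Compute Dice, sensitivity, and specificity for two binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)

    tp = np.logical_and(pred, truth).sum()        # tumor pixels correctly labeled
    tn = np.logical_and(~pred, ~truth).sum()      # healthy pixels correctly labeled
    fp = np.logical_and(pred, ~truth).sum()       # healthy pixels labeled tumor
    fn = np.logical_and(~pred, truth).sum()       # tumor pixels labeled healthy

    dice = 2.0 * tp / (2.0 * tp + fp + fn + eps)  # Dice similarity coefficient
    sensitivity = tp / (tp + fn + eps)            # true-positive rate
    specificity = tn / (tn + fp + eps)            # true-negative rate
    return dice, sensitivity, specificity

# Toy example: a 4x4 prediction that over-segments the truth mask by one pixel.
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
truth = np.array([[0, 1, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
print(pixel_metrics(pred, truth))  # Dice ~0.857, sensitivity 1.0, specificity ~0.923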
