Abstract

The authors present a deep learning algorithm for the automatic centroid localisation of out-of-plane ultrasound (US) needle reflections, yielding a semi-automatic US probe calibration algorithm. A convolutional neural network was trained on a dataset of 3825 images acquired at a 6 cm imaging depth to predict the position of the centroid of a needle reflection. Applying the automatic centroid localisation algorithm to a test set of 614 annotated images produced root mean squared errors of 0.62 and 0.74 mm (6.08 and 7.62 pixels) in the axial and lateral directions, respectively. The mean absolute errors on the test set were 0.50 ± 0.40 mm and 0.51 ± 0.54 mm (4.9 ± 3.96 pixels and 5.24 ± 5.52 pixels) in the axial and lateral directions, respectively. The trained model produced visually validated US probe calibrations at imaging depths in the range of 4–8 cm, despite being trained solely at 6 cm. This work automates the pixel localisation required by the guided-US calibration algorithm, producing a semi-automatic implementation available open-source through 3D Slicer. The automatic needle centroid localisation improves the usability of the algorithm and has the potential to decrease the fiducial localisation and target registration errors associated with the guided-US calibration method.
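The axial/lateral error metrics reported above can be computed per direction from predicted and annotated centroid coordinates. The sketch below illustrates this, assuming hypothetical (axial, lateral) pixel coordinates and an approximate isotropic pixel spacing inferred from the mm/pixel ratios in the abstract (e.g. 0.62 mm ≈ 6.08 px); the data values and the `MM_PER_PX` constant are illustrative, not from the paper.

```python
import numpy as np

# Approximate pixel spacing implied by the abstract's ratios (assumption).
MM_PER_PX = 0.102

# Hypothetical predicted vs. annotated centroids, (axial, lateral) in pixels.
pred = np.array([[120.0, 200.0], [130.0, 210.0], [140.0, 190.0]])
true = np.array([[121.0, 198.0], [128.0, 213.0], [143.0, 191.0]])

err_px = pred - true                          # per-image signed error
rmse_px = np.sqrt(np.mean(err_px**2, axis=0))  # (axial, lateral) RMSE
mae_px = np.mean(np.abs(err_px), axis=0)       # (axial, lateral) MAE
mae_std_px = np.std(np.abs(err_px), axis=0)    # spread of absolute errors

print("RMSE (axial, lateral) px:", rmse_px)
print("RMSE (axial, lateral) mm:", rmse_px * MM_PER_PX)
print("MAE  (axial, lateral) px:", mae_px, "±", mae_std_px)
```

Reporting the two directions separately, as the paper does, is useful because axial and lateral resolution differ in US imaging.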

Highlights

  • The US probe calibration method that is used for this work is based on the Guided US Calibration (GUSCAL), which formulates the US calibration as a Procrustean point-to-line registration problem [5]

  • The tracked needle GUSCAL method requires localisation of the centroid of out-of-plane needle reflections, where, rather than the entire needle being in the US plane, it is inserted at an oblique angle intersecting the US plane, producing a cross-sectional reflection of the needle shaft on a black background [5]

  • In previous work, we found that the manual localisations result in fiducial localisation error (FLE) that propagates into target registration error (TRE)


Introduction

Ultrasound (US) scanners are common in image-guided interventions, as they produce real-time imaging without exposing the patient to harmful ionising radiation [1]. Mixed-reality US-guided surgical navigation systems aim to improve the usability of US-guided interventions by using a 3D virtual environment to provide a visual relationship between tracked surgical instruments and real-time US images [2]. These systems rely on US probe calibration to establish the spatial transformation between the US image and a tracking sensor attached to the transducer [3]. The US probe calibration method used in this work is based on Guided US Calibration (GUSCAL), which formulates US calibration as a Procrustean point-to-line registration problem [5]. The tracked-needle GUSCAL method requires localisation of the centroid of out-of-plane needle reflections: rather than the entire needle lying in the US plane, the needle is inserted at an oblique angle intersecting the US plane, producing a cross-sectional reflection of the needle shaft on a black background [5].
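To make the "Procrustean point-to-line registration" formulation concrete, the following is a minimal sketch of one common way to solve such a problem: alternately project the transformed points onto their corresponding lines and solve the resulting point-to-point orthogonal Procrustes (Kabsch) problem. This is a generic illustration of the formulation cited above, not the authors' GUSCAL implementation; all function names and the iteration scheme are assumptions.

```python
import numpy as np

def kabsch(P, Q):
    """Rigid (R, t) minimising ||R @ P + t - Q||_F; points are 3xN columns."""
    p_bar = P.mean(axis=1, keepdims=True)
    q_bar = Q.mean(axis=1, keepdims=True)
    H = (P - p_bar) @ (Q - q_bar).T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    return R, q_bar - R @ p_bar

def point_to_line_registration(pts, origins, dirs, n_iter=200):
    """Alternating scheme: project transformed points onto their lines,
    then re-solve the point-to-point Procrustes problem.
    pts: 3xN points; origins, dirs: 3xN line origins and unit directions."""
    R, t = np.eye(3), np.zeros((3, 1))
    for _ in range(n_iter):
        moved = R @ pts + t
        s = np.sum((moved - origins) * dirs, axis=0)  # parameter along each line
        proj = origins + dirs * s                     # closest point on each line
        R, t = kabsch(pts, proj)
    return R, t
```

In the GUSCAL setting described above, the points would be needle-reflection centroids in the US image and the lines would be the tracked needle axes, so localising those centroids accurately (the subject of this work) directly affects the registration.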
