Abstract

In routine ultrasonography, slice images of the internal organs of the human body are usually acquired with a 1D array probe. The position and orientation of the probe are adjusted manually to obtain slice planes showing pathological features, so the result depends heavily on the sonographer's experience and technique. This paper aims to locate 2D slice planes within a 3D breast ultrasound volume, which has significant application value in clinical ultrasound examinations. We propose a deep learning approach that maps all possible 2D image slices to their 3D coordinate parameters using a fully connected neural network implemented in MATLAB. We emphasize that training must be performed separately for each patient, since mammary tissue structure varies greatly from one person to another; the trained network can thus be interpreted as a per-patient image-slice location database. Our study is validated on GE ABUS (Automated Breast Ultrasound System) volume data. Each 2D image slice has four spatial parameters. The method achieves prediction errors of 0.14 mm/0.25 mm for the translation parameters (x/y) and 0.5°/0.3° for the rotation parameters (yaw/roll), averaged over all practically scannable slices. Predicting the location of one 64×64 slice image takes less than 0.1 ms, so slice locations can be displayed with high accuracy in real time while scanning with a conventional 1D probe, potentially allowing physicians to steer the probe to any scan plane of interest.
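To make the regression setup concrete, the following MATLAB sketch trains a small fully connected network that maps vectorized 64×64 slice images to the four pose parameters (x, y, yaw, roll). The layer widths, training options, and the variables slices, poses, and newSlice are illustrative assumptions, not details taken from the paper.

```matlab
% Hypothetical sketch of the per-patient slice-to-pose regression.
% slices : 64x64xN array of 2D slice images from one patient's volume
% poses  : 4xN matrix of targets, one column per slice: [x; y; yaw; roll]

X = reshape(slices, 64*64, []);     % vectorize each slice: 4096xN inputs
T = poses;                          % 4xN regression targets

net = feedforwardnet([512 128]);    % assumed hidden-layer widths
net.trainFcn = 'trainscg';          % scaled conjugate gradient scales better
                                    % than the default trainlm for 4096 inputs
net.trainParam.epochs = 200;        % assumed training budget
net = train(net, X, T);

% Inference on a new 64x64 slice: a single forward pass, fast enough
% for real-time display of the estimated slice location.
poseHat = net(reshape(newSlice, [], 1));   % 4x1 estimate: [x; y; yaw; roll]
```

Because the network serves as a per-patient location database, net would be retrained or reloaded whenever a new ABUS volume is acquired for a different patient.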
