Abstract

We present a semi-autonomous robotic ultrasound system that offers a novel way to automate the ultrasound image-taking process. The system combines force feedback with a PID controller and a convolutional neural network (CNN) image classifier that guides probe movement. The user supplies the arm with the coordinates of the patient's heart's approximate position, which the system uses as its starting point for imaging. Once the arm reaches this position, a PID controller maintains the contact force desired for ultrasound imaging without discomfort to the patient. An ultrasound video is then parsed frame by frame through a CNN, which classifies each frame's image quality as satisfactory, partially satisfactory, or not satisfactory. Partially satisfactory images, which contain part of the heart, are then used to reposition the probe. The repositioning function analyzes the intensity profile of the averaged image array and instructs the arm to make the adjustments needed to obtain a satisfactory image. Overall, the ultrasound robot successfully acquired ultrasound images and provides a novel methodology for using robots in ultrasound imaging.
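The abstract does not specify how the intensity profile of the averaged image maps to an arm adjustment. One minimal sketch, assuming the bright (echogenic) region of a partially satisfactory frame should be brought to the image centre, is to average the frames, compute the intensity-weighted centroid, and return its offset from the centre as a movement command (the function name, `pixel_to_mm` scale, and sign conventions here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def reposition_offset(frames, pixel_to_mm=0.1):
    """Estimate a probe adjustment from partially satisfactory frames.

    Averages the grayscale frames, computes the intensity-weighted
    centroid of the mean image, and returns the (dx, dy) offset of that
    centroid from the image centre, scaled to millimetres. A positive dx
    means the bright region lies to the right of centre, so the probe
    would be moved in that direction.
    """
    mean_img = np.mean(np.asarray(frames, dtype=float), axis=0)
    total = mean_img.sum()
    if total == 0:
        return 0.0, 0.0  # blank image: no basis for an adjustment
    rows, cols = np.indices(mean_img.shape)
    cy = (rows * mean_img).sum() / total  # intensity-weighted row centroid
    cx = (cols * mean_img).sum() / total  # intensity-weighted column centroid
    h, w = mean_img.shape
    dy = (cy - (h - 1) / 2) * pixel_to_mm
    dx = (cx - (w - 1) / 2) * pixel_to_mm
    return dx, dy
```

In practice the returned offset would be clipped and fed to the arm's motion controller as a small incremental move, with the PID force loop continuing to regulate contact pressure during the adjustment.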
