Abstract

Amid the severe COVID-19 epidemic, lung ultrasound (LUS) has proved to be an effective and convenient method for diagnosing respiratory disease and evaluating its extent. However, traditional clinical ultrasound (US) scanning requires doctors not only to be in close contact with patients but also to have extensive experience. To alleviate the shortage of medical resources and to reduce doctors' workload and risk of infection, we propose a robotic autonomous LUS scanning localization system based on visual perception and a convolutional neural network (CNN), which performs scan-target recognition, probe pose computation and movement, and US image acquisition. The LUS scan targets are identified by a target segmentation and localization algorithm based on an improved CNN, which uses a depth camera to collect image information. A method based on multiscale compensated normal vectors is then used to solve for the attitude of the probe. Finally, a force-feedback position control strategy is designed to optimize the position and attitude of the probe, which not only yields high-quality US images but also ensures the safety of the patient and the system. Human LUS scanning experiments verify the accuracy and feasibility of the system. The positioning accuracy of the scan targets is 15.63 ± 0.18 mm, and the distance and rotation-angle accuracies of the probe pose computation are 6.38 ± 0.25 mm and 8.60° ± 2.29°, respectively. More importantly, the acquired high-quality US images clearly capture the main pathological features of the lung. The system is expected to be applicable in clinical practice.
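The abstract's probe-attitude step rests on estimating a surface normal from depth-camera data at several neighborhood scales. As a minimal illustrative sketch (not the authors' implementation; the radii, the PCA-based estimator, and the sign-compensation rule are all assumptions), a multiscale normal at a scan point can be computed by averaging PCA normals over neighborhoods of increasing radius:

```python
import numpy as np

def estimate_normal(points, center, radius):
    """PCA normal of the neighbors of `center` within `radius` (assumed estimator)."""
    nbrs = points[np.linalg.norm(points - center, axis=1) < radius]
    cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
    # The eigenvector of the smallest eigenvalue approximates the surface normal.
    w, v = np.linalg.eigh(cov)
    n = v[:, 0]
    return n / np.linalg.norm(n)

def multiscale_normal(points, center, radii=(0.01, 0.02, 0.04)):
    """Average single-scale normals over several radii to damp depth noise;
    signs are flipped to agree with the first scale (hypothetical compensation)."""
    normals, ref = [], None
    for r in radii:
        n = estimate_normal(points, center, r)
        if ref is None:
            ref = n
        elif np.dot(n, ref) < 0:  # keep a consistent orientation across scales
            n = -n
        normals.append(n)
    m = np.mean(normals, axis=0)
    return m / np.linalg.norm(m)
```

The averaged normal would then define the probe axis, with the remaining rotational freedom and contact depth left to the force-feedback controller described above.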
