Abstract

This study presents a novel teleoperated ultrasound system that combines augmented reality (AR) and haptic feedback to reduce the influence of time delay on the operator. The physician uses a haptic device to send commands to the remote site while observing an AR environment in which the organ model is projected onto the corresponding anatomical position according to the patient's pose and a UR robot model responds to the physician's actions in advance. Based on measured force data, a dynamic environment model is identified online with the recursive least-squares method, enabling real-time force feedback: the contact force is predicted from the identified parameters together with the position and velocity of the master device. An experimental platform was built, and experiments were conducted to evaluate the overall architecture, highlighting the roles of the AR display and the contact-force prediction model and including a clinical validation. The results show that the system correctly performs remote ultrasound scanning tasks while ensuring realism and synchronization.
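The abstract describes identifying the remote environment online with recursive least squares and then predicting the contact force at the master side from the identified parameters and the master's position and velocity. The sketch below illustrates that idea under simplifying assumptions: a linear spring-damper (Kelvin-Voigt) contact model F = k*x + b*v with a forgetting factor. The model structure, parameter choices, and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class RLSContactModel:
    """Recursive least-squares identification of a spring-damper
    (Kelvin-Voigt) contact model F = k * x + b * v.

    Minimal sketch of online environment identification for
    delay-tolerant haptic feedback; forgetting factor and model
    structure are assumptions for illustration.
    """

    def __init__(self, forgetting=0.98):
        self.theta = np.zeros(2)      # identified parameters [k, b]
        self.P = np.eye(2) * 1e3      # covariance (large = uncertain prior)
        self.lam = forgetting         # forgetting factor for time-varying tissue

    def update(self, penetration, velocity, measured_force):
        """Update [k, b] from one force sample measured at the remote site."""
        phi = np.array([penetration, velocity])       # regressor
        gain = (self.P @ phi) / (self.lam + phi @ self.P @ phi)
        error = measured_force - phi @ self.theta     # prediction error
        self.theta = self.theta + gain * error
        self.P = (self.P - np.outer(gain, phi) @ self.P) / self.lam
        return self.theta

    def predict_force(self, master_position, master_velocity):
        """Predict contact force locally from master-side motion,
        so the haptic loop does not wait on the delayed channel."""
        k, b = self.theta
        return k * master_position + b * master_velocity
```

In this scheme the remote site only needs to stream force and probe-motion samples for the identification step; once the parameters have converged, the master side renders feedback from its own motion, which is how delayed measurements can be decoupled from the real-time haptic loop.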
