Abstract

This paper presents a machine learning-based guidance (LbG) approach for kinesthetic human-robot interaction (HRI) that can be used in virtual training simulations. Demonstrated positional and force skills are learned both to discriminate the skill levels of users and to produce LbG forces. Force information is obtained from virtual forces, which are computed from real computed tomography (CT) data rather than measured by force sensors. A femur bone drilling simulation is developed to provide a practice environment for orthopaedic residents. The residents receive haptic feedback that enables them to feel the variable stiffness of bone layers. X-ray views of the bone are also presented to them for better tracking of a pre-defined path inside the bone. The simulation is capable of planning a drill path, generating X-rays based on a user-defined orientation, and recording motion data for user assessment and skill modeling. The knowledge of expert surgeons is also incorporated into the simulation to provide LbG forces that correct the unpredictable motions of the residents. To discriminate the skill levels of users, machine learning tools are used to develop surgical expert and resident models. In addition, to improve residents' performance, the expert hidden conditional random field (HCRF) model is used to generate adaptive LbG forces according to the similarity between residents' motions and the expert model. Experimental results show that the learning-based approach is able to assess the skill of users and improve residents' performance.
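To make the adaptive-guidance idea concrete, the following minimal sketch illustrates one plausible way an expert-model likelihood could modulate an LbG force, as the abstract describes. It is not the authors' implementation: the function names (`expert_loglik_fn`, `lbg_force`), the sigmoid mapping from log-likelihood to a similarity score, and the spring-like pull toward the planned drill path are all assumptions made for illustration.

```python
import numpy as np

def lbg_force(resident_traj, reference_traj, expert_loglik_fn, k_max=2.0):
    """Hypothetical sketch of adaptive learning-based guidance (LbG).

    resident_traj: (T, 3) array of recent tool positions from the resident.
    reference_traj: (T, 3) array of positions on the pre-planned drill path.
    expert_loglik_fn: assumed callable returning the log-likelihood of a
        motion window under the trained expert HCRF model.
    k_max: maximum guidance stiffness gain (assumed units: N/m).
    """
    # Score how much the resident's motion resembles the expert model.
    loglik = expert_loglik_fn(resident_traj)

    # Assumption: squash the log-likelihood into a (0, 1) similarity score.
    similarity = 1.0 / (1.0 + np.exp(-loglik))

    # Guidance gain grows as the motion departs from the expert model,
    # so a skilled motion receives little corrective force.
    gain = k_max * (1.0 - similarity)

    # Assumption: spring-like force pulling the tool toward the planned path.
    error = reference_traj[-1] - resident_traj[-1]
    return gain * error
```

The key design point this sketch captures is the one stated in the abstract: the guidance force is adaptive, shrinking when the resident's motion already matches the expert model and strengthening when it deviates.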
