Abstract

The surge of interest in haptic technology is driven by advances in robotic-assisted surgical systems, where haptics delivers tactile feedback to enhance the user experience. This work presents a Long Short-Term Memory (LSTM) based Recurrent Neural Network (RNN) framework with Dimensionality Reduction (DR) and a Cyclical Learning Rate (CLR) optimizer for reproducing the variable forces generated in different skin layers during various surgical procedures. The paper addresses online estimation of the force parameters of real porcine skin, and the approach has been tested both in real time and in a visuo-haptic environment for surgeon training. The proposed model processes both spatial and temporal information acquired from three different datasets, surgical tools, and a manipulator. The proposed RNN-LSTM + DR + CLR framework improves force-prediction accuracy by 9.23% and 3.8% in real time, and by 7.11% and 1.68% in visuo-haptic simulation, compared with the RNN and RNN-LSTM prediction frameworks, respectively. Sensitivity analysis shows the respective impact of the torque (97.62%), position (94.54%), deformation (93.20%), stiffness (89.23%), tool diameter (87.25%), rotation (63.21%), and orientation (62.56%) features on the predicted force. The RNN-LSTM performed best when the network was optimized with dimensionality reduction, a Root Mean Square Error (RMSE) loss function, and a cyclical learning rate. The results demonstrate the effectiveness of the method for estimating force at the surface and internal layers of the skin, with applications in real-time surgical tasks and surgeon training.
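The pipeline outlined above (dimensionality reduction, an LSTM-based recurrent network, RMSE loss, and a cyclical learning rate) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, sequence length, PCA-based reduction, and optimizer settings are assumptions, and only the seven listed input features and the RMSE/CLR choices come from the abstract.

```python
# Hypothetical sketch of an RNN-LSTM + DR + CLR force-prediction pipeline.
# Layer sizes, sequence length, and the use of PCA for dimensionality
# reduction are illustrative assumptions, not the paper's actual code.
import torch
import torch.nn as nn
from sklearn.decomposition import PCA

SEQ_LEN, N_FEATURES, N_REDUCED, HIDDEN = 50, 7, 4, 64  # assumed values

class ForceLSTM(nn.Module):
    """LSTM-based RNN mapping a window of tool-state features to a force estimate."""
    def __init__(self, n_inputs=N_REDUCED, hidden=HIDDEN):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # predicted interaction force

    def forward(self, x):                  # x: (batch, seq_len, n_inputs)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # force at the last time step

# Dimensionality reduction (assumed here to be PCA) over the raw torque, position,
# deformation, stiffness, tool-diameter, rotation, and orientation features.
pca = PCA(n_components=N_REDUCED)

model = ForceLSTM()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CyclicLR(          # cyclical learning rate
    optimizer, base_lr=1e-4, max_lr=1e-2, step_size_up=200)
rmse = lambda pred, target: torch.sqrt(nn.functional.mse_loss(pred, target))
```

A single training step on synthetic stand-in data (real inputs would come from the porcine-skin experiments) might then look like this:

```python
import numpy as np

raw = np.random.randn(32 * SEQ_LEN, N_FEATURES)                   # placeholder data
reduced = pca.fit_transform(raw).reshape(32, SEQ_LEN, N_REDUCED)  # DR step
x = torch.tensor(reduced, dtype=torch.float32)
y = torch.randn(32, 1)                                            # placeholder forces

optimizer.zero_grad()
loss = rmse(model(x), y)   # RMSE loss, as named in the abstract
loss.backward()
optimizer.step()
scheduler.step()           # advance the cyclical learning-rate schedule
```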
