Abstract

We introduce a wearable-based recognition system for the classification of natural hand gestures during dynamic activities with surgical instruments. An armband-based circular setup of eight EMG sensors was used to superficially measure the muscle activation signals over the broadest cross-section of the lower arm. Instrument-specific surface EMG (sEMG) data acquisition was performed for five distinct instruments. In a first proof-of-concept study, EMG data were analyzed for unique signal courses and features, and in a subsequent classification, both decision tree (DTR) and shallow artificial neural network (ANN) classifiers were trained. For DTR, an ensemble bagging approach reached precision and recall rates of 0.847 and 0.854, respectively. The ANN network architecture was configured to mimic the ensemble-like structure of the DTR and achieved precision and recall rates of 0.952 and 0.953, respectively. In a subsequent multi-user study, classification achieved 70% precision. Main errors potentially arise from instruments with similar gripping styles and performed actions, interindividual variations in the acquisition procedure, and differences in muscle tone and activation magnitude. Compared to hand-mounted sensor systems, the lower-arm setup does not alter the haptic experience or the instrument gripping, which is critical, especially in an intraoperative environment. Current drawbacks of the fixed consumer product setup are the limited data sampling rate and the exclusion of frequency-domain features from the processing pipeline.
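The ensemble-bagging DTR classifier described above could be sketched with scikit-learn's `BaggingClassifier`, whose default base estimator is a decision tree. The feature matrix here is a synthetic stand-in for the paper's 8-channel sEMG features (one mean-activation value per channel, with a class-dependent offset); the actual features, data, and hyperparameters are assumptions.

```python
# Hedged sketch: bagged decision trees on synthetic stand-ins for
# 8-channel sEMG feature vectors; not the authors' actual pipeline.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
n_instruments = 5   # five surgical instruments (classes)
n_channels = 8      # circular armband with eight sEMG sensors
n_samples = 500

# Synthetic features: class-dependent offset makes classes separable.
y = rng.integers(0, n_instruments, n_samples)
X = rng.normal(size=(n_samples, n_channels)) + y[:, None] * 0.8

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Default base estimator of BaggingClassifier is a decision tree.
clf = BaggingClassifier(n_estimators=30, random_state=0).fit(X_tr, y_tr)

pred = clf.predict(X_te)
print(precision_score(y_te, pred, average="macro"))
print(recall_score(y_te, pred, average="macro"))
```

Macro-averaged precision and recall mirror the per-class evaluation reported in the abstract, though the numbers here reflect only the synthetic data.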

Highlights

  • The digital operating room (OR) is a highly outcome- and cost-driven environment, and personnel and hospital providers are continuously challenged to improve operation quality and efficiency

  • Channel 2 was most sensitive for perforator and sharp spoon activities

  • The cross-correlation of channel data showed that channels 4-7 had the lowest similarity values across all instrument activities
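The channel-similarity analysis in the last highlight could be sketched with NumPy, using the peak of the normalized cross-correlation as a pairwise similarity measure. The paper's exact similarity metric, sampling rate, and data are not given here, so all of those are illustrative assumptions.

```python
# Hedged sketch: pairwise similarity between the eight armband channels,
# taken as the peak of the normalized cross-correlation over all lags.
import numpy as np

rng = np.random.default_rng(1)
fs = 200                             # assumed sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)          # two seconds of signal
emg = rng.normal(size=(8, t.size))   # stand-in for 8-channel sEMG data

def peak_xcorr(a, b):
    """Maximum absolute normalized cross-correlation over all lags."""
    a = (a - a.mean()) / (a.std() * a.size)
    b = (b - b.mean()) / b.std()
    return np.abs(np.correlate(a, b, mode="full")).max()

sim = np.array([[peak_xcorr(emg[i], emg[j]) for j in range(8)]
                for i in range(8)])

# Channels whose rows show the lowest off-diagonal similarity carry the
# most distinctive information for instrument discrimination.
np.fill_diagonal(sim, np.nan)
print(np.nanmean(sim, axis=1))
```

Averaging each row of the similarity matrix (excluding the diagonal) ranks channels by redundancy; in the study, channels 4-7 showed the lowest similarity values across instrument activities.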


Introduction

The digital operating room (OR) is a highly outcome- and cost-driven environment, and personnel and hospital providers are continuously challenged to improve operation quality and efficiency. One aspect of optimization is to recognize the intraoperative workflow based on the surgical work steps or the resources used, such as surgical instruments. The main drawbacks are the limited line of sight into the work area, sensor contamination, as well as limited stationary acquisition capabilities and handling overhead for medical personnel. In the operating room, the functionalization of wearables for the identification of surgical tasks, e.g. through the recognition of natural user gestures, is still unexploited. We present an approach for a wearable-based recognition system using multi-channel EMG data of the surgeon's lower arm movement. The system uses a circular sensor setup to acquire activity data of the surgeon through grasping information of the lower arm during the usage of surgical instruments. We provide a proof-of-concept study and a first multi-user classification, outlining the potential of the classification approach.


