Abstract
Many operations in robot-assisted surgery (RAS) can be viewed hierarchically: each surgical task is represented by a superstate, which can be decomposed into finer-grained states. Estimating these discrete states at different levels of temporal granularity provides temporal perception of the current surgical scene during RAS, a crucial step towards many automated surgeon-assisting functionalities. We propose Hierarchical Estimation of Surgical States through Deep Neural Networks (HESS-DNN), a deep learning-based system that concurrently estimates the current superstate and fine-grained state. HESS-DNN incorporates endoscopic vision, robot kinematics, and system events data from the da Vinci Xi surgical system. HESS-DNN is evaluated on HERNIA-20, a real-world robotic inguinal hernia repair surgery dataset, and achieves accurate estimates of both the surgical superstate and the corresponding fine-grained surgical state. We show that, through its hierarchical design, HESS-DNN improves on state-of-the-art fine-grained state estimation across the entire HERNIA-20 RAS procedure. We also analyze the relative contributions of each input data type and of HESS-DNN's design choices to surgical (super)state estimation accuracy.
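To illustrate the hierarchical, multimodal design described in the abstract, the sketch below shows one plausible way to fuse per-frame features from the three input modalities and produce concurrent superstate and fine-grained state estimates, with the fine-grained head conditioned on the superstate estimate. This is a minimal illustration, not the authors' implementation: all dimensions, the concatenation-based fusion, and the linear heads are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper).
D_VISION, D_KINEMATICS, D_EVENTS = 8, 6, 4
N_SUPER, N_FINE = 3, 10
D = D_VISION + D_KINEMATICS + D_EVENTS

# Hypothetical linear heads standing in for trained network layers:
# one for the superstate, one for the fine-grained state that also
# receives the superstate probabilities (the hierarchical coupling).
W_super = rng.normal(size=(N_SUPER, D))
W_fine = rng.normal(size=(N_FINE, D + N_SUPER))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def estimate_states(vision, kinematics, events):
    # Fuse the three modalities by simple concatenation.
    x = np.concatenate([vision, kinematics, events])
    p_super = softmax(W_super @ x)
    # The fine-grained head sees both the fused features and the
    # current superstate distribution.
    p_fine = softmax(W_fine @ np.concatenate([x, p_super]))
    return int(p_super.argmax()), int(p_fine.argmax())

super_id, fine_id = estimate_states(
    rng.normal(size=D_VISION),
    rng.normal(size=D_KINEMATICS),
    rng.normal(size=D_EVENTS),
)
print(super_id, fine_id)
```

In a real system the linear heads would be replaced by trained deep networks and the conditioning scheme would be learned, but the control flow (fuse modalities, estimate the superstate, condition the fine-grained estimate on it) mirrors the hierarchical idea.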