Abstract

Innovative applications of medical informatics are making aspects of surgery possible that could not be addressed before. One of these is contactless surgery planning and contactless control of medical-data visualization. In our approach, we adopted a new framework for hand and motion detection based on augmented reality. We developed a contactless interface that lets a surgeon control the visualization options of our DICOM (Digital Imaging and Communications in Medicine) viewer platform: a stereo camera serves as the sensor input, tracking hand and finger motions without physical contact, and we applied the interface to 3D virtual endoscopy. In this paper, we present our proposal for defining motion parameters in contactless, incisionless surgeries. The system provides a better surgeon experience, more precise surgery, real-time feedback, depth motion tracking, and contactless control of visualization, giving the surgeon freedom of movement during the operation. We implemented motion tracking using stereo cameras with depth resolution and precise shutter sensors for depth streaming. Our solution provides contactless control at a range of up to 2–3 m, which enables its use in the operating room.
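The abstract describes mapping depth-tracked hand motion to viewer controls. As a purely illustrative sketch (the paper's actual motion parameters, gesture set, and thresholds are not given in the abstract), the idea of classifying frame-to-frame 3D hand displacement into visualization commands could look like this, where the axis-to-command mapping and the 2 cm dead zone are assumptions:

```python
# Hypothetical sketch: classifying tracked 3D hand motion into viewer
# commands. Assumes a stereo depth sensor delivers (x, y, z) hand
# coordinates in metres each frame; all names and thresholds are
# illustrative, not taken from the paper.

from dataclasses import dataclass


@dataclass
class HandSample:
    x: float  # horizontal position (m)
    y: float  # vertical position (m)
    z: float  # distance from the camera (m), from stereo depth


def classify_gesture(prev: HandSample, curr: HandSample,
                     threshold: float = 0.02) -> str:
    """Map the dominant axis of frame-to-frame motion to a command."""
    dx, dy, dz = curr.x - prev.x, curr.y - prev.y, curr.z - prev.z
    # Ignore sensor jitter below the dead-zone threshold.
    if max(abs(dx), abs(dy), abs(dz)) < threshold:
        return "idle"
    # Depth motion (toward/away from the camera) drives zoom;
    # in-plane motion drives panning.
    if abs(dz) >= max(abs(dx), abs(dy)):
        return "zoom_in" if dz < 0 else "zoom_out"
    if abs(dx) >= abs(dy):
        return "pan_right" if dx > 0 else "pan_left"
    return "pan_up" if dy > 0 else "pan_down"
```

A real system would smooth the position stream and require a confirmation gesture before acting, so that incidental hand movement in the operating room does not trigger commands.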

