Background
Surgical context-aware systems can adapt to the current situation in the operating room and thus provide computer-aided assistance functionalities and intraoperative decision support. To interact perceptively with the surgical team and assist the surgical process, the system needs to monitor intraoperative activities, understand the current situation in the operating room at any time, and anticipate which situations may follow.

Methods
A structured representation of surgical process knowledge is a prerequisite for any application in the intelligent operating room. For this purpose, a surgical process ontology, formally grounded in standard medical terminology (SNOMED CT) and an upper-level ontology (GFO), was developed and instantiated for a neurosurgical use case. A novel ontology-based method for surgical workflow recognition and prediction is presented, utilizing ontological reasoning, abstraction, and explication. In this way, a surgical situation representation combining phase, high-level task, and low-level task recognition and prediction was realized, using the currently applied instrument as the only input information.

Results
The ontology-based approach performed efficiently, achieving decent accuracy for situation recognition and prediction. In particular, during situation recognition, missing sensor information was inferred from the situation representation provided by the process ontology, which improved recognition results compared to the state of the art.

Conclusions
In this work, a reference ontology was developed, which provides workflow support and a knowledge base for further applications in the intelligent operating room, for instance, context-aware medical device orchestration, (semi-)automatic documentation, and surgical simulation, education, and training.
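To make the core idea of the Methods more concrete, the following minimal Python sketch illustrates how the currently used instrument could be mapped to a combined phase / high-level task / low-level task representation, with a simple fallback when sensor input is missing. It is not the ontology or reasoner described in the abstract; all instrument names, task labels, phase ordering, and mappings are hypothetical placeholders standing in for the SNOMED CT/GFO-based ontology and its reasoning machinery.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified knowledge base mapping an instrument to the
# (low-level task, high-level task, phase) it implies. The real system
# derives this from a SNOMED CT/GFO-based process ontology; these
# entries are illustrative placeholders only.
SITUATION_KB = {
    "scalpel": ("incise_skin", "skin_incision", "opening"),
    "bipolar_forceps": ("coagulate_vessel", "hemostasis", "tumor_resection"),
    "ultrasonic_aspirator": ("remove_tissue", "tumor_removal", "tumor_resection"),
    "suture_needle": ("suture_skin", "wound_closure", "closing"),
}

# Hypothetical phase ordering used to anticipate the following situation.
PHASE_ORDER = ["opening", "tumor_resection", "closing"]


@dataclass
class Situation:
    low_level_task: str
    high_level_task: str
    phase: str


def recognize(instrument: str, previous: Optional[Situation]) -> Optional[Situation]:
    """Map the currently used instrument to a situation triple.

    If the instrument is unknown (missing sensor information), fall back to
    the previously recognized situation, loosely mimicking the idea of
    bridging sensor gaps through the situation representation.
    """
    entry = SITUATION_KB.get(instrument)
    if entry is None:
        return previous
    return Situation(*entry)


def predict_next_phase(current: Situation) -> Optional[str]:
    """Anticipate the next phase from the assumed phase ordering."""
    idx = PHASE_ORDER.index(current.phase)
    return PHASE_ORDER[idx + 1] if idx + 1 < len(PHASE_ORDER) else None


if __name__ == "__main__":
    situation: Optional[Situation] = None
    for observed in ["scalpel", "unknown_sensor_reading", "ultrasonic_aspirator"]:
        situation = recognize(observed, situation)
        nxt = predict_next_phase(situation) if situation else None
        print(f"{observed}: {situation} | predicted next phase: {nxt}")
```

In this sketch the lookup table plays the role of the ontology and the fallback branch plays the role of ontological reasoning; the actual method replaces both with formal reasoning, abstraction, and explication over the instantiated process ontology.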