Objectives
Virtual reality (VR) is an increasingly valuable teaching tool, but current simulators are not typically clinically scalable due to their reliance on inefficient manual segmentation. The objective of this project was to leverage a high-throughput and accurate machine learning method to automate data preparation for a patient-specific VR simulator used to explore preoperative sinus anatomy.

Methods
An endoscopic VR simulator was designed in Unity to enable interactive exploration of sinus anatomy. The Saak transform, a data-efficient machine learning method, was adapted to accurately segment sinus computed tomography (CT) scans using minimal training data, and the resulting data were reconstructed into three-dimensional (3D) patient-specific models that could be explored in the simulator.

Results
Using minimal training data, the Saak transform-based machine learning method offers accurate soft-tissue segmentation. When explored with an endoscope in the VR simulator, the anatomical models generated by the algorithm accurately capture key sinus structures and showcase patient-specific variability in anatomy.

Conclusion
By offering an automatic means of preparing VR models from a patient's raw CT scans, this pipeline takes a key step toward clinical scalability. In addition to preoperative planning, this system also enables virtual endoscopy, a tool that is particularly useful in the COVID-19 era. As VR technology inevitably continues to develop, such a foundation will help ensure that future innovations remain clinically accessible.
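To make the reconstruction step concrete, the following is a minimal sketch, not the authors' implementation, of turning a binary segmentation mask (the kind of output a segmentation model such as the Saak-transform classifier would produce) into a surface mesh that a Unity-based simulator could import. It assumes scikit-image and NumPy are available; the placeholder mask, the function name `mask_to_obj`, and the voxel spacing are illustrative assumptions, not details from the paper.

```python
"""Illustrative sketch: binary CT segmentation mask -> triangle mesh (OBJ) for Unity.
The machine-learning segmentation step itself is replaced by a synthetic mask."""

import numpy as np
from skimage import measure


def mask_to_obj(mask: np.ndarray, out_path: str, voxel_spacing=(1.0, 1.0, 1.0)) -> None:
    """Reconstruct a surface mesh from a 3D binary mask and write a Wavefront OBJ file."""
    # Extract the isosurface at the boundary between background (0) and segmented tissue (1).
    verts, faces, _normals, _values = measure.marching_cubes(
        mask.astype(np.float32), level=0.5, spacing=voxel_spacing
    )
    with open(out_path, "w") as f:
        for v in verts:
            f.write(f"v {v[0]:.4f} {v[1]:.4f} {v[2]:.4f}\n")
        for face in faces:
            # OBJ vertex indices are 1-based.
            f.write(f"f {face[0] + 1} {face[1] + 1} {face[2] + 1}\n")


if __name__ == "__main__":
    # Placeholder standing in for a segmentation result: a hollow spherical shell
    # loosely mimicking the wall of an air-filled sinus cavity.
    grid = np.zeros((64, 64, 64), dtype=np.uint8)
    x, y, z = np.ogrid[:64, :64, :64]
    r = np.sqrt((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2)
    grid[(r > 15) & (r < 20)] = 1

    mask_to_obj(grid, "sinus_surface.obj", voxel_spacing=(0.5, 0.5, 0.5))
```

The resulting OBJ file can be dropped into a Unity project's Assets folder and rendered like any other mesh; in a real pipeline the mask would come from the trained segmentation model applied to a patient's CT volume rather than from synthetic data.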