Journal of Urology
Surgical Technology & Simulation: Instrumentation & Technology II (PD41)
1 Sep 2021

PD41-03 MULTI-INSTITUTION COMPARISON OF NINE FLEXIBLE URETEROSCOPES USING A VALIDATED FLEXIBLE URETEROSCOPE EVALUATION TOOL

Margaret Knoedler, Scott Quarrier, Shuang Li, Alex Uhr, Shreya Patel, John Bell, Kristina Penniston, Scott Hubosky, Rajat Jain, and Stephen Nakada

https://doi.org/10.1097/JU.0000000000002051.03

Abstract

INTRODUCTION AND OBJECTIVE: Clinical urologists need a standardized way to compare the quality of flexible ureteroscopes for clinical and research purposes. We previously validated a flexible ureteroscope evaluation tool (Bell 2017). Our objective was to apply the tool widely to test its consistency across multiple institutions and multiple ureteroscopes.

METHODS: Ureteroscopes (n=9) were assessed using the ureteroscope evaluation tool at three academic institutions. Surgeons completed the evaluation tool at the end of cases in which a variety of flexible ureteroscopes were used [Storz Flex Xc (Xc), Storz Flex X2 (X2), Olympus URF-P5 (P5), Olympus URF-P6 (P6), Olympus URF-P7 (P7), WiScope, LithoVue, Dornier, and Pusen]. The tool contains nine evaluation domains designed to provide a comprehensive assessment of the ureteroscope (Figure 1).
The domains are: historic experience with the device, quality of image, strength of deflection, scope maneuverability, intuitiveness of controls, ease of irrigation, ease of ureteroscopic access, functionality of working channel, and overall satisfaction. SPSS was used for statistical analysis: internal consistency of the tool and MANOVA for comparing the scopes across domains.

RESULTS: A total of 311 responses distributed across the three institutions were completed for the nine ureteroscopes. Results demonstrated the tool's internal consistency (Cronbach's alpha coefficient = 0.85). The ureteroscope with the highest overall satisfaction score was the P7 (4.81 ± 0.26), and the lowest was the Pusen (1.43 ± 0.79). Mean scores across domains are reported in Figure 2.

CONCLUSIONS: Our evaluation tool can be used to evaluate new ureteroscopes as they come to market. It is applicable to a variety of ureteroscopes, with consistent results across multiple institutions and over time. This is important because urologists do not always have the opportunity to test each ureteroscope in hand.

Source of Funding: None

Volume 206, Issue Supplement 3, September 2021, Page e681
© 2021 by American Urological Association Education and Research, Inc.
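As a rough illustration of the internal-consistency statistic reported above, the sketch below computes Cronbach's alpha from per-domain ratings. The function and the sample scores are hypothetical, invented for illustration; the study's actual analysis was performed in SPSS on the real survey responses.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one list of scores per evaluation domain, aligned by
    respondent (items[d][r] = respondent r's score on domain d).
    """
    k = len(items)        # number of domains
    n = len(items[0])     # number of respondents

    def var(xs):          # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(domain) for domain in items)
    totals = [sum(domain[r] for domain in items) for r in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Fabricated example: 3 domains rated 1-5 by 4 respondents
scores = [
    [4, 5, 3, 4],
    [4, 4, 3, 5],
    [5, 5, 2, 4],
]
print(round(cronbach_alpha(scores), 2))  # → 0.82
```

By convention, an alpha of 0.7 or above is taken to indicate acceptable internal consistency, so the study's reported value of 0.85 supports treating the nine domains as a coherent instrument.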