Abstract

A novel software tool, DiffTool, was developed in-house to track changes made by board-certified radiologists to preliminary reports written by residents, and its impact on hands-on radiological training was evaluated. Before (t0) and after (t2−4) deployment of the software, 18 residents (median age: 29 years; 33% female) completed a standardized questionnaire on professional training. At t2−4, participants were also asked to answer three additional questions evaluating the software. Responses were recorded on a six-point Likert scale ranging from 1 (“strongly agree”) to 6 (“strongly disagree”). Before the release of the software, 39% (7/18) of the residents strongly agreed with the statement that they manually tracked changes made by board-certified radiologists to each of their radiological reports, while 61% were less inclined to agree. At t2−4, 61% (11/18) stated that they used DiffTool to track these differences. Furthermore, the proportion of residents who agreed with the statement “I profit from every corrected report” increased from 33% (6/18) to 44% (8/18). DiffTool was well accepted among residents, with a regular user base of 72% (13/18), and 78% (14/18) considered it a relevant improvement to their training. The results of this study demonstrate the importance of providing a time-efficient way to analyze changes made to preliminary reports as an adjunct to professional training.
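The abstract does not describe how DiffTool is implemented. Purely as an illustration of the underlying idea, a change-tracking tool of this kind could compare a resident's preliminary report with the board-certified radiologist's final version using a textual diff. The minimal sketch below relies on Python's standard difflib module; the function name and the sample report text are hypothetical and not taken from the study.

```python
import difflib


def report_changes(preliminary: str, final: str) -> str:
    """Return a unified diff between a preliminary report and its final version.

    Illustrative sketch only: DiffTool's actual comparison logic is not
    described in the source article.
    """
    diff = difflib.unified_diff(
        preliminary.splitlines(),
        final.splitlines(),
        fromfile="preliminary_report",
        tofile="final_report",
        lineterm="",
    )
    return "\n".join(diff)


if __name__ == "__main__":
    # Hypothetical example reports for demonstration.
    preliminary = "No acute intracranial hemorrhage.\nMild sinus mucosal thickening."
    final = "No acute intracranial hemorrhage or mass effect.\nMild sinus mucosal thickening."
    print(report_changes(preliminary, final))
```

For a resident-facing interface, a side-by-side rendering (for example via difflib.HtmlDiff) might be preferable to a raw unified diff, since it lets the reader scan changed wording in context rather than line by line.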
