Abstract

Physician competency assessment requires validated methods and instruments. The Royal Australian and New Zealand College of Radiologists (RANZCR) developed a draft audit form to be evaluated as a competency assessment instrument for radiation oncologists (ROs) in Australasia. We evaluated the reliability of the RANZCR instrument, together with a separate instrument designed by The Cancer Institute (TCI), Singapore, by having two ROs perform independent chart reviews of 80 randomly selected patients seen at TCI Singapore. Each chart was scored with both the RANZCR and the TCI Singapore instruments, and the inter- and intra-observer reliability of the two instruments was compared using misclassification rates as the primary end-point. For inter-observer reproducibility, 2.3% of TCI Singapore items were misclassified, compared with 22.3% of RANZCR items (P < 0.0001; 100.00% confidence that the TCI instrument has less inter-observer misclassification). For intra-observer reproducibility, 2.4% of TCI Singapore items were misclassified, compared with 13.6% of RANZCR items (P < 0.0001; 100.00% confidence that the TCI instrument has less intra-observer misclassification). The proposed RANZCR RO revalidation audit instrument requires further refinement to improve its validity: several items require modification or removal because of poor reliability, whereas other important and reproducible items could be incorporated, as demonstrated by the TCI Singapore instrument. The TCI Singapore instrument also has the advantage of a simple scoring system and criticality index that allow discrimination between ROs and comparison against future College standards.
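
The abstract does not specify how the misclassification rates were compared, but the "% confidence" phrasing suggests a posterior probability from a Bayesian comparison of two proportions. The sketch below illustrates one plausible such analysis, a beta-binomial model with a uniform prior; the item counts are hypothetical placeholders chosen only to match the reported percentages, not the study's actual denominators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical item counts: the study's true denominators are not given
# in the abstract; these are placeholders for illustration only.
tci_misclassified, tci_total = 18, 800          # ~2.3% misclassified
ranzcr_misclassified, ranzcr_total = 178, 800   # ~22.3% misclassified

# Beta-binomial model with a uniform Beta(1, 1) prior on each
# instrument's misclassification probability.
tci_post = rng.beta(1 + tci_misclassified,
                    1 + tci_total - tci_misclassified, size=100_000)
ranzcr_post = rng.beta(1 + ranzcr_misclassified,
                       1 + ranzcr_total - ranzcr_misclassified, size=100_000)

# Posterior probability that the TCI instrument has the lower
# misclassification rate -- the "% confidence" style of statement.
confidence = np.mean(tci_post < ranzcr_post)
print(f"P(TCI rate < RANZCR rate) = {confidence:.2%}")
```

With rates as far apart as those reported, the posterior probability rounds to 100.00%, consistent with the abstract's wording; the same comparison would also reach P < 0.0001 under a frequentist two-proportion test.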
