Abstract

Ensuring patient safety and managing risk are central goals of radiation oncology centers. Comprehensive risk management tools for quality and patient safety in radiotherapy are neither widely available nor sufficient, contributing to inadequate error reporting, compromised clinical operations, and noncompliance with regulatory requirements. This work examines the results of implementing a software-based risk management program at multiple centers, with the objective of reducing overall errors, decreasing costs, and improving efficiency throughout the entire treatment process. The program was implemented at radiation oncology centers A, B, and C and used to identify, categorize, evaluate, and correct preventable systems-related errors. Errors were classified by whether they occurred before or after treatment commenced, using a taxonomy of type, category, subcategory, and attribute. Failure modes and effects analysis (FMEA) was performed, and a risk priority number (RPN) was assigned to each error as a relative surrogate metric for risk. Action plans with root-cause analysis and reporting capability were interfaced with an internal email system for error review and approval. A total of 1,578 errors were identified at centers A, B, and C. The most common errors (209, or 13%) were failures to provide timely and accurate treatment planning, field verification, and CT simulation notes (before, during, and after treatment). Charge capture errors were second most frequent at 175 (11%), followed by portal imaging errors at 161 (10%), record-and-verify data entry errors at 92 (5.8%), and QA errors at 91 (5.8%). Treatment delivery errors at centers A, B, and C showed error rates of 0.32%, 3.2%, and 4.21% per patient; 0.01%, 0.11%, and 0.12% per fraction; and 0.001%, 0.001%, and 0.007% per field, respectively. For comparison, Huang and Marks reported per-patient error rates of 1.97% and 1.2–4.7%, respectively; Frass, French, Huang, and Marks reported per-fraction error rates of 0.44%, 0.32%, 0.29%, and 0.5%, respectively; and Frass, French, Macklis, Patton, and Margalit reported per-field error rates of 0.13%, 0.037%, 0.18%, 0.17%, and 0.064%, respectively. Across the entire treatment process (patient registration through completion of treatment), the combined pre- and post-treatment error rates at centers A, B, and C were 82%, 27%, and 99% per patient; 2.4%, 0.92%, and 2.8% per fraction; and 0.31%, 0.01%, and 0.17% per field, respectively. The direct costs of errors were calculated based on time and salary assumptions; the estimated total direct costs to mitigate errors at centers A, B, and C were $489K, $714K, and $125K, respectively. A program designed to reduce errors in the overall treatment process can serve as an effective tool for managing risk, decreasing costs, and improving efficiency.
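The abstract does not give the FMEA scoring details. The sketch below assumes the conventional formulation in which the risk priority number is the product of severity, occurrence, and detectability ratings, each on a 1-10 scale; the field names, scales, and example errors are illustrative assumptions, not the authors' implementation.

    # Minimal FMEA sketch (assumption: conventional RPN = severity x occurrence x detectability).
    # The 1-10 scales and example records are illustrative, not taken from the paper.
    from dataclasses import dataclass

    @dataclass
    class ErrorRecord:
        description: str
        category: str        # e.g., "documentation", "charge capture", "portal imaging"
        severity: int        # 1 (negligible) .. 10 (catastrophic)
        occurrence: int      # 1 (rare) .. 10 (frequent)
        detectability: int   # 1 (always caught) .. 10 (never caught)

        @property
        def rpn(self) -> int:
            """Risk priority number: a relative surrogate metric for risk."""
            return self.severity * self.occurrence * self.detectability

    errors = [
        ErrorRecord("Missing CT simulation note", "documentation", 4, 6, 3),
        ErrorRecord("Wrong charge code entered", "charge capture", 2, 7, 2),
    ]
    # Rank errors for corrective action, highest relative risk first.
    for e in sorted(errors, key=lambda e: e.rpn, reverse=True):
        print(f"{e.rpn:4d}  {e.category:15s} {e.description}")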
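The per-patient, per-fraction, and per-field figures above are ratios of error counts to treatment volumes. The following sketch shows that arithmetic with hypothetical counts; the paper reports only the resulting percentages, so the denominators here are made up for illustration.

    # Error-rate arithmetic: rate (%) = 100 * error_count / denominator.
    # All counts below are hypothetical; the abstract reports only the resulting rates.
    def error_rate(n_errors: int, denominator: int) -> float:
        return 100.0 * n_errors / denominator

    # Example: a center treating 2,500 patients, 75,000 fractions, 600,000 fields.
    n_errors = 8
    print(f"per patient:  {error_rate(n_errors, 2_500):.2f}%")
    print(f"per fraction: {error_rate(n_errors, 75_000):.3f}%")
    print(f"per field:    {error_rate(n_errors, 600_000):.4f}%")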
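Direct costs were calculated from time and salary assumptions. One plausible reading, sketched below with hypothetical hours and hourly rates, sums mitigation time multiplied by staff salary over each error type; the paper states only the resulting totals per center.

    # Direct-cost sketch: cost = count * hours_to_mitigate * hourly_rate, summed over error types.
    # Hours and rates are hypothetical assumptions; only the error counts come from the abstract.
    mitigation = {
        # error type: (count, hours to resolve each, staff hourly rate in $)
        "documentation":  (209, 1.5, 60.0),
        "charge capture": (175, 1.0, 45.0),
        "portal imaging": (161, 2.0, 75.0),
    }
    total = sum(count * hours * rate for count, hours, rate in mitigation.values())
    print(f"estimated direct cost: ${total:,.0f}")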
