Abstract

Purpose: To assess the complementary natures of (a) a peer review (PR)-mandated database for physician review and discrepancy reporting and (b) a voluntary quality assurance (QA) system for anecdotal reporting.

Materials and Methods: This study was institutional review board approved and HIPAA compliant; informed consent was waived. Submissions to the voluntary QA and mandatory PR databases were searched for obstetrics and gynecology-related keywords. Cases were graded independently by two radiologists, with final grades resolved by consensus. Errors were categorized as perceptual, interpretive, communication related, or procedural. The effect of each error was assessed in terms of clinical and radiologic follow-up.

Results: There were 185 and 64 cases with issues, attributed to 32 and 27 radiologists, in the QA and PR databases, respectively; 23 and nine radiologists, respectively, had cases attributed only to them. Procedure-related entries were submitted almost exclusively through the QA database (62 of 64 [97%]). In the QA and PR databases, respectively, perceptual (47 of 185 [25%] and 27 of 64 [42%]) and interpretive (64 of 185 [34%] and 30 of 64 [47%]) issues constituted most of the errors. Most entries in both databases (104 of 185 [56%] in QA and 49 of 64 [76%] in PR) were considered minor events: wording in the report; findings already known from patient history, prior imaging, or concurrent follow-up imaging; or a delay in diagnosing a benign finding. The databases had similar percentages of moderate events (28 of 185 [15%] in QA and nine of 64 [14%] in PR), such as recommending unnecessary follow-up imaging or radiation exposure during a pregnancy that was not known at the time of imaging. The PR database had fewer major events (one of 64 [1.6%]) than the QA database (32 of 185 [17%]).

Conclusion: The two quality improvement systems are complementary: the QA database yields less frequent but more clinically important errors, while the PR database serves to establish benchmarks for error rates in radiologists' performance.
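The reported proportions can be re-derived from the raw counts quoted above. Below is a minimal Python sketch that tabulates those counts and recomputes each percentage; the dictionary layout and category labels are illustrative assumptions made for this check, not data structures from the study itself.

```python
# Counts quoted in the abstract: entries per category and total cases
# in each database (QA = quality assurance, PR = peer review).
counts = {
    "QA": {"total": 185, "perceptual": 47, "interpretive": 64,
           "minor": 104, "moderate": 28, "major": 32},
    "PR": {"total": 64, "perceptual": 27, "interpretive": 30,
           "minor": 49, "moderate": 9, "major": 1},
}

for db, cats in counts.items():
    total = cats["total"]
    for category, n in cats.items():
        if category == "total":
            continue
        pct = 100 * n / total  # e.g. 47 of 185 -> 25.4%
        print(f"{db}: {category:<12} {n:>3} of {total} [{pct:.1f}%]")
```

Running this reproduces figures such as "47 of 185 [25.4%]" for perceptual errors in the QA database and "1 of 64 [1.6%]" for major events in the PR database, matching the percentages stated in the abstract (which are rounded to whole percents except the 1.6% figure).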
