Abstract
The objective of our study was to categorize radiologist peer review comments and evaluate their functions within the context of a comprehensive quality assurance (QA) program. All radiology peer review comments entered at our institution during random peer review were compiled over a 1-year period (January 1, 2011, through December 31, 2011). A commercially available Web-based software package was used to query the comments, which were then exported into a spreadsheet. Each comment was placed into the single most appropriate category by consensus of two board-certified pediatric radiologists, and the QA score associated with each comment was recorded. A total of 427 peer review comments were evaluated. The majority of comments (85.9%) were entered voluntarily with QA scores of 1. A classification system was devised that augments traditional error classification. Seven broad comment categories were identified: errors of observation (25.5%), errors of interpretation (5.6%), inadequate patient data gathering (3.7%), errors of communication (9.6%), interobserver variability (21.3%), informational and educational feedback (23.0%), and complimentary comments (11.2%). Comment-enhanced peer review expands traditional diagnostic error classification, may identify errors that were under-scored, provides continuous educational feedback for participants, and promotes a collegial environment.