Abstract

Background

Consistency of assessments between radiologists has become an accepted surrogate indicator of radiological excellence, and peer review has become the standard technique for evaluating concordance. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice.

Methods

Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if the reviewing radiologist and the original interpreting radiologist had not exceeded review targets, the case was scored using the modified RADPEER system.

Results

There were 2,241 cases randomly assigned for peer review. Of selected cases, 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%), and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found scores appropriate, and 65% felt feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable.

Conclusions

The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
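The scoring scheme described above (modified RADPEER: score 0 for positive feedback, scores 0 to 2 counted as concordant, scores 3 or 4 as clinically significant discrepancies) can be sketched as a small tallying routine. This is a minimal illustration of how the reported rates are computed, not part of the study's software; the function name and structure are assumptions for demonstration.

```python
from collections import Counter

# Modified RADPEER scale as described in the study:
# 0 = positive feedback, 0-2 = concordant, 3-4 = clinically
# significant discrepancy. (Illustrative sketch only.)
CONCORDANT_SCORES = {0, 1, 2}
DISCREPANT_SCORES = {3, 4}

def summarize_reviews(scores):
    """Tally modified RADPEER scores and return counts plus the
    concordance and clinically significant discrepancy rates,
    expressed as fractions of all completed reviews."""
    counts = Counter(scores)
    total = sum(counts.values())
    concordant = sum(counts[s] for s in CONCORDANT_SCORES)
    discrepant = sum(counts[s] for s in DISCREPANT_SCORES)
    return {
        "total": total,
        "positive_feedback": counts[0],
        "concordance_rate": concordant / total if total else 0.0,
        "significant_discrepancy_rate": discrepant / total if total else 0.0,
    }
```

Fed the study's reported distribution (1,705 reviews, 3 cases scored 0, 10 cases scored 3 or 4), this yields a concordance rate of about 99.4% and a significant discrepancy rate of about 0.6%, matching the abstract.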

Highlights

  • Consistency of assessments between radiologists has become an accepted surrogate indicator of radiological excellence, with peer review the standard technique for evaluating concordance

  • A total of 10 radiologists participated as peer reviewers

  • Subspecialty accreditation in the imaging modalities relevant to this pilot study included ultrasonography (10/10), mammography (6/10), cardiac echocardiography (6/10), and nuclear medicine (1/10); for reviewed cases, both the interpreting and reviewing radiologists were accredited in the relevant subspecialty



Introduction

Consistency of assessments between radiologists has become an accepted surrogate indicator of radiological excellence, and peer review has become the standard technique for evaluating concordance. The definitive quality assessment approach in radiology is correlation of radiological findings with the ultimate clinical outcome [1,2,4,10]. By far the most widely used peer review approach in North America is the RADPEER program of the American College of Radiology (ACR) [3,4]. In this scheme, members of participating radiology groups evaluate prior images and reports of cases currently being reported and rate the quality of the original interpretation [3,4]. More than 10,000 radiologists have participated in this program, representing about one-third of radiologists in the United States [3,4,17].

