Abstract

Objective: To compare the interrater and intergroup agreement in judging physician maloccurrence and compliance with standards of care using the implicit case review process.

Design: Mail survey with questionnaire.

Participants: Case reviews and questionnaires were mailed to 140 board-certified ophthalmologists and 140 board-certified ophthalmologists with fellowship training.

Main outcome measure: Agreement in judging maloccurrence and compliance with the standard of care within each group and between general ophthalmologists and specialists.

Results: Ninety-seven (35%) questionnaires were returned. Overall, 35% of respondents believed that the ophthalmologists in the case reviews committed an error of either commission or omission. Forty-five percent of reviewers believed that the physicians did not meet the standard of care. There was good within-group agreement on finding clinical error in management and on not meeting the standard of care for all groups (kappa coefficient range: 0.55–0.83; P < 0.004) except retina specialists (kappa coefficient = 0.12; P = 0.2).

Conclusions: Unstructured implicit case review is not a reliable method for determining physician error or for measuring compliance with standards of care. The process is susceptible to bias, and results may vary with reviewer background or training. Unstructured implicit case review should be regarded as a rough screening tool and used accordingly.
