Abstract
To test whether conventional data reliability assessment overestimates reliability, the reliability of complex quality indicators and of their simpler components was assessed and compared. Medical records of 1078 Medicare cases with principal diagnoses of initial episodes of acute myocardial infarction (AMI) were independently reabstracted at two national Clinical Data Abstraction Centers (CDACs). Inter-rater agreement beyond chance (kappa) between reabstracted and original quality indicators and their key components was computed and compared. Results showed excellent agreement (kappas ranging from 0.88 to 0.95) for simple determinations of whether standard medical therapies were provided. Repeatability of eligibility status and of the more complex determinations of whether "ideal" candidates went untreated was moderate to excellent, with kappa values ranging from 0.41 to 0.79. A planned comparison of five similar quality indicators and their key components showed that the simpler treatment components, as a group, had significantly higher kappas than the more complexly derived eligibility components and composite indicators (Fisher's exact test, p < 0.02). Reliability assessment of quality indicators should therefore be based on the repeatability of the whole indicator, accounting for both data and logic, rather than on any single simple element.
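For reference, kappa measures agreement beyond chance between two raters: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from the raters' marginal rates. The sketch below shows how this statistic can be computed for a binary indicator abstracted independently at two centers; the case values are hypothetical illustrations, not data from the study.

```python
# Minimal sketch: Cohen's kappa for a binary quality indicator
# abstracted independently at two centers. The data below are
# hypothetical, for illustration only; they are not study cases.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Agreement beyond chance: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    # Observed agreement: fraction of cases where abstractions match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal rates,
    # summed over categories.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# 1 = therapy documented, 0 = not documented (hypothetical cases)
original   = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
reabstract = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
print(f"kappa = {cohen_kappa(original, reabstract):.2f}")  # kappa = 0.74
```

Note that composite indicators chain several such determinations (eligibility plus treatment) through indicator logic, so their repeatability can fall well below that of any single component, which is the pattern the study reports.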