Abstract

The Optimality Index-US, a recently developed perinatal clinimetric index, has been validated in both clinical and research databases. Documentation of the instrument's reliability for medical record abstraction is needed. This paper reports the results of interrater reliability assessments conducted for two projects. Abstraction was supervised by the same investigator in both projects but staffed by different coders with a variety of qualifications (perinatal nurse, nurse-midwife, clinical trial professional, student research assistants). Medical records were entirely paper at one site and partially electronic at the other. Reliability (reproducibility) was assessed as percent agreement between pairs of coders on charts randomly selected for audit. Mean percent agreement was 92.7% in both projects, ranging from 89.1% to 97.8% in the first project and from 88.5% to 96.2% in the second. The sources of error differed between clinician and lay abstractors, but the number of errors did not. The average time per chart was assessed in the first project: once proficiency was achieved, coding took an average of 24 minutes per chart, with additional time needed to order paper charts. These analyses indicate that excellent reproducibility can be achieved with the Optimality Index-US.
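
To illustrate the reliability metric reported above, the sketch below shows one way percent agreement between a pair of coders might be computed over the items abstracted from a single chart. The item names and values are hypothetical examples, not data from the study, and the function is a minimal sketch rather than the authors' actual audit procedure.

```python
def percent_agreement(coder_a, coder_b):
    """Percent of items on which two coders recorded the same value.

    coder_a, coder_b: dicts mapping item names to coded values for one chart.
    Assumes both coders scored the same set of items.
    """
    matches = sum(1 for item in coder_a if coder_a[item] == coder_b[item])
    return 100.0 * matches / len(coder_a)

# Hypothetical example: two coders abstracting three items from one chart.
a = {"gestational_age_weeks": 39, "apgar_5min": 9, "labor_induced": False}
b = {"gestational_age_weeks": 39, "apgar_5min": 8, "labor_induced": False}
print(f"{percent_agreement(a, b):.1f}% agreement")  # 66.7% agreement
```

In practice, agreement for an audit would be averaged across all randomly selected charts, which is how a project-level figure such as the 92.7% reported here would be obtained.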
