Abstract

Analysis of extant clinical records is receiving increased emphasis in nursing investigations. Appropriate use of this approach to patient research requires careful attention to data management, including assessment of reliability. Percent agreement, phi, and Kappa all serve as estimates of interrater reliability in the analysis of data. Kappa has particular merit as a measure of interrater reliability; it also has some peculiar problems in implementation and interpretation. The nature and computation of Kappa and its application in analysis of clinical data are discussed.
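Since the abstract centers on the computation of Kappa as a chance-corrected measure of interrater agreement, a minimal sketch of that computation may help. The function name and example data below are illustrative, not drawn from the article; the formula is the standard Cohen's Kappa, (observed agreement − expected chance agreement) / (1 − expected chance agreement).

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's Kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative data: two raters coding 10 clinical records as present/absent.
a = ["present", "present", "absent", "present", "absent",
     "absent", "present", "present", "absent", "present"]
b = ["present", "absent", "absent", "present", "absent",
     "absent", "present", "present", "present", "present"]
print(round(cohens_kappa(a, b), 3))
```

Here the raw percent agreement is 0.80, but because both raters use the categories unevenly, chance alone would produce 0.52 agreement; Kappa corrects for this, yielding a lower value and illustrating why Kappa and percent agreement can diverge.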
