Abstract

The purpose of the study was to augment the previous CAHEA and AOTA Commission on Education validity and reliability studies with a reliability study focused on the accrediting activity processes conducted within AOTA in conjunction with CAHEA. The three objectives of the study were to establish (1) the internal consistency of AC, RAE, and AMA raters; (2) the interrater reliability among AC, RAE, and AMA raters; and (3) the accuracy of AC, RAE, and AMA raters. Overall, the RAE raters were the most internally consistent in their ratings (.94), followed by the AC raters (.93) and the AMA raters (.88). Because many of the RAE raters had recently participated in the new AOTA accreditation orientation workshop, it is not surprising that their ratings showed the least variability and so closely matched those of the AC raters. On the basis of these results, the raters in this study (44.5% of the total population of raters) could be expected to be highly consistent as they evaluate Essentials that are usually in compliance, those that are frequently out of compliance, and those in between. Interrater reliability among all raters, across all five Essentials and all four decision points, was established at .93 using percent agreement with absolute concordance. Interrater reliability among the AC and RAE raters alone reached a percent agreement of .95. As more data became available (i.e., at Decision 3 and Decision 4), rater concordance increased. By Decision 4, the most critical decision point because these data constitute the Report of On-Site Evaluation (ROSE), percent agreement ranged from .83 to .96 across all three rating groups and from .86 to 1.00 for the AC and RAE raters. (ABSTRACT TRUNCATED AT 250 WORDS)
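
The interrater reliability index reported above, percent agreement with absolute concordance, counts an item as agreed upon only when every rater assigns exactly the same rating. The sketch below illustrates that computation on hypothetical ratings; it is not the study's analysis code, and the 0/1 compliance coding and example values are assumptions for illustration only.

```python
# Minimal sketch of percent agreement with absolute concordance.
# The rating data below are hypothetical, not from the study.

def percent_agreement(ratings_by_rater):
    """Proportion of items on which every rater gives an identical rating.

    ratings_by_rater: list of equal-length lists, one per rater, where each
    element is that rater's rating of one Essential/item.
    """
    n_items = len(ratings_by_rater[0])
    agreements = sum(
        1 for item in range(n_items)
        if len({rater[item] for rater in ratings_by_rater}) == 1
    )
    return agreements / n_items

# Hypothetical ratings (e.g., 1 = in compliance, 0 = out of compliance)
ac  = [1, 1, 0, 1, 0]
rae = [1, 1, 0, 1, 0]
ama = [1, 0, 0, 1, 0]

print(percent_agreement([ac, rae]))        # 1.00 for the two-group comparison
print(percent_agreement([ac, rae, ama]))   # 0.80 when all three groups are included
```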
