Abstract

Objective
To examine the psychometric properties of MEDICODE, a coding instrument developed to assess medication discussions during medical consultations.

Methods
Inter-coder reliability, test–retest stability, concurrent validity with the Roter Interaction Analysis System (RIAS), and predictive validity with the Medical Interview Satisfaction Scale (MISS) were calculated.

Results
Inter-coder reliability and test–retest stability for medication class and status were both very good. Inter-coder agreement and test–retest stability for theme identification were mostly over 90%. Kappa values for theme identification ranged from acceptable to excellent for 21 of the 29 and 26 of the 37 Kappa coefficients that could be calculated. The mean percent agreement between MEDICODE and RIAS for medication class was 96.8%, and the mean Kappa value was 0.83. Although the mean percent agreement for the presence of a theme in MEDICODE and RIAS was 81%, the average Kappa coefficient was lower, at 0.40. However, each of the four broad theme categories included themes with robust Kappa values. We found significant positive correlations (p < 0.05) of discussions of medication main effects and instructions with patient satisfaction.

Conclusion
With a reasonable amount of training, the coders were able to produce reliable and valid measures of discussions of medications during medical consultations.

Practical implications
MEDICODE will facilitate the study of how the nature and intensity of discussions about medications during consultations affect patient medication knowledge, medication recall, and compliance.
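The two agreement statistics reported above, percent agreement and Cohen's Kappa, can be illustrated with a minimal sketch. This is not the MEDICODE software; the function names and the example coder labels are hypothetical, and the sketch assumes two coders assigning one categorical label per utterance:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of items on which the two coders assign the same label."""
    matches = sum(x == y for x, y in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's Kappa: observed agreement corrected for chance agreement.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected by chance from each coder's label rates.
    """
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    p_e = sum((counts_a[lab] / n) * (counts_b[lab] / n) for lab in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical theme codes ("present"/"absent") from two coders:
a = ["present", "present", "absent", "absent", "present"]
b = ["present", "absent", "absent", "absent", "present"]
print(percent_agreement(a, b))  # high raw agreement
print(cohens_kappa(a, b))       # lower once chance agreement is removed
```

This gap between raw agreement and Kappa is exactly the pattern reported in the Results: 81% mean percent agreement for theme presence can coexist with a mean Kappa of 0.40 when some themes are rare, because chance agreement on the common label inflates the raw percentage.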
