Abstract

Many approaches in the item response theory (IRT) literature have incorporated response styles to control for potential biases. However, the specific assumptions about response styles are often not made explicit. Having integrated different IRT modeling variants into a superordinate framework, we highlighted assumptions and restrictions of the models (Henninger & Meiser, 2020). In this article, we show that, based on the superordinate framework, the different models can be estimated as multidimensional extensions of the nominal response model in standard software environments. Furthermore, we illustrate the differences in estimated parameters, restrictions, and model fit of the IRT variants in a German Big Five standardization sample and show that psychometric models can be used to debias trait estimates. Based on this analysis, we suggest 2 novel modeling extensions that either combine fixed and estimated scoring weights for response style dimensions or explain discrimination parameters through item attributes. In summary, we highlight possibilities to estimate, apply, and extend psychometric modeling approaches for response styles in order to test hypotheses on response styles through model comparisons.
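To make the modeling idea concrete, the following is a minimal sketch of category probabilities for a single 5-point item under a multidimensional nominal-response-model extension with one extreme response style (ERS) dimension and fixed scoring weights. The function name, the particular scoring weights, and all numeric values are hypothetical illustrations, not the authors' exact specification; in practice such models are fitted with standard IRT software rather than coded by hand.

```python
import numpy as np

def category_probs(theta, eta_ers, a, beta, s=None, w_ers=None):
    """Category probabilities for one 5-point item under a multidimensional
    nominal-response-model extension with one ERS dimension.

    theta   : target-trait value
    eta_ers : extreme-response-style trait value
    a       : item discrimination on the target trait
    beta    : category intercepts (length 5; identification would be
              handled by constraints, e.g., sum-to-zero, in practice)
    s       : fixed target-trait scoring weights (default 0, 1, 2, 3, 4)
    w_ers   : fixed ERS scoring weights (default: 1 for the two extreme
              categories, 0 otherwise -- a common but assumed choice)
    """
    s = np.arange(5) if s is None else np.asarray(s)
    w_ers = np.array([1, 0, 0, 0, 1]) if w_ers is None else np.asarray(w_ers)
    # Divide-by-total (nominal model) structure: softmax over category logits.
    logits = a * s * theta + w_ers * eta_ers + beta
    logits -= logits.max()  # subtract max for numerical stability
    p = np.exp(logits)
    return p / p.sum()

# Illustration: a respondent with an average trait level (theta = 0) but a
# strong ERS tendency; probability mass shifts toward categories 0 and 4.
beta = np.array([0.0, 0.2, 0.5, 0.2, 0.0])
print(category_probs(theta=0.0, eta_ers=1.5, a=1.2, beta=beta))
```

Replacing the fixed weights `w_ers` with estimated category weights, or adding further style dimensions (e.g., midpoint responding), yields the other model variants in the framework as nested special cases that can be compared via model fit.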
