Abstract

Collapsing response categories in attitudinal measurement on learning and instruction is a common choice for physics education research (PER) scholars when interpreting categorical responses from participants. Psychometric evaluation of different numbers of categories, however, has yielded inconclusive results to date. Item response theory (IRT) offers an item-level psychometric framework, which this study applies to explore the psychometric properties of five different numbers of categories. One thousand artificial response data sets for each of the five category numbers were generated from an underlying marginal distribution and inter-item correlation matrix. Statistical parameters [item estimates and test information functions (TIFs)] from the Graded Response Model (GRM) were employed to describe the psychometric behavior of the studied category numbers. Our findings demonstrate that the discrimination index remained constant across the five category numbers, and that global fit indices provided no distinguishing information among them. Based on item location and TIF, our results confirm previous work showing that a greater number of categories yields more information. These findings offer a key recommendation for attitudinal measurement in the PER community and beyond the psychometric field.
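The abstract's comparison rests on Samejima's Graded Response Model, in which each item has a discrimination parameter and ordered threshold (location) parameters, and item information is computed from the boundary response curves. As a minimal NumPy sketch (not the authors' code; the parameter values are illustrative assumptions), the category probabilities and item information can be computed like this:

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """GRM category probabilities for ability values theta,
    discrimination a, and ordered thresholds b (len(b)+1 categories)."""
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    # Boundary curves P*_k = logistic(a * (theta - b_k)),
    # padded with 1 (below lowest) and 0 (above highest).
    pstar = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - np.asarray(b)[None, :])))
    pstar = np.hstack([np.ones((len(theta), 1)), pstar, np.zeros((len(theta), 1))])
    # Category probability = difference of adjacent boundary curves.
    return pstar[:, :-1] - pstar[:, 1:]

def grm_item_information(theta, a, b):
    """Samejima item information: a^2 * sum_k (w_k - w_{k+1})^2 / P_k,
    where w_k = P*_k (1 - P*_k)."""
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    pstar = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - np.asarray(b)[None, :])))
    pstar = np.hstack([np.ones((len(theta), 1)), pstar, np.zeros((len(theta), 1))])
    w = pstar * (1.0 - pstar)
    probs = pstar[:, :-1] - pstar[:, 1:]
    return a**2 * np.sum((w[:, :-1] - w[:, 1:]) ** 2 / probs, axis=1)

# Illustrative comparison at theta = 0 with a fixed discrimination of 1.5:
# a 4-category item (thresholds -1, 0, 1) versus a 2-category item (threshold 0).
info_4cat = grm_item_information(0.0, 1.5, [-1.0, 0.0, 1.0])[0]
info_2cat = grm_item_information(0.0, 1.5, [0.0])[0]
```

With these assumed parameters the 4-category item carries more information at the center of the ability scale than the 2-category item, consistent with the study's finding that more categories raise the test information function.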
