Abstract
The type of response options selected for survey items, along with how many response options to include and whether to allow neutral midpoints, affects the data obtained from survey collections and the interpretations made from the results. Further, if subgroups within a population (e.g., racial/ethnic, gender, age) interpret response options differently, this variance can produce spurious differences or mask true differences between groups. In this study, we apply two recursive partitioning procedures for investigating differential item functioning (DIF) in an experiment evaluating seven item response formats (five levels of an agree–disagree [AD] format and two levels of an item-specific [IS] format). Partial credit tree procedures allow for the evaluation of multiple covariates without prespecifying the subgroups to be compared. We applied the procedures to items measuring adults’ attitudes toward legal abortion; all response formats functioned without DIF for age, gender, race, education, and religion when evaluated using global DIF screening approaches. Item-focused analyses indicated that odd-numbered response formats were less susceptible to content-based DIF. Taken together, these psychometric properties indicate that five-point AD and IS formats may be preferable for measuring abortion attitudes, based on the screening procedures conducted in this study.