Abstract
The scarce attention paid to assessment and evaluation in science education research has been especially harmful for the teaching of science–technology–society (STS) issues, owing to the dialectical, tentative, value‐laden, and polemic nature of most STS topics. This paper tackles the methodological difficulties of the instruments that monitor views on STS topics and rationalizes a quantitative methodology and an analysis technique to improve the utility of an empirically developed multiple‐choice item pool, the Questionnaire of Opinions on STS. The methodology comprises item scaling based on the judgments of a panel of experts, a multiple‐response model, a scoring system, and data analysis. It produces normalized attitudinal indices that represent the respondent's reasoned beliefs toward individual STS statements, the respondent's position on an item comprising several statements, or the respondent's position on an entire STS topic encompassing a set of items. Preliminary results show the methodology's ability to evaluate STS attitudes both qualitatively and quantitatively and to support statistical hypothesis testing. Lastly, some applications for teacher training and STS curriculum development in science classrooms are discussed. © 2006 Wiley Periodicals, Inc. Sci Ed 90:681–706, 2006
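As a rough illustration of how a normalized attitudinal index of this kind might be derived, the sketch below is a minimal Python example under assumed conventions: it assumes a 1–9 agreement rating per statement and a three-way expert classification of statements (here labeled "adequate", "plausible", and "naive"), and it maps ratings linearly onto a [−1, +1] index. The category names, rating scale, and transformations are illustrative assumptions, not the published scoring rules of the Questionnaire of Opinions on STS.

```python
# Illustrative sketch only: the category names, the 1-9 rating scale, and the
# linear transformations below are assumptions, not the questionnaire's actual
# scoring system.
from statistics import mean


def statement_index(rating: int, category: str) -> float:
    """Map a 1-9 agreement rating to a normalized index in [-1, +1].

    'adequate'  : strong agreement (9) is taken as the most informed response.
    'naive'     : strong disagreement (1) is taken as the most informed response.
    'plausible' : moderate agreement (around 5) is taken as the most informed response.
    """
    if not 1 <= rating <= 9:
        raise ValueError("rating must be on the 1-9 scale")
    if category == "adequate":
        return (rating - 5) / 4
    if category == "naive":
        return (5 - rating) / 4
    if category == "plausible":
        return (4 - 2 * abs(rating - 5)) / 4  # +1 at 5, -1 at the extremes
    raise ValueError(f"unknown expert category: {category}")


def item_index(ratings_and_categories) -> float:
    """Average the statement indices of one multiple-statement item."""
    return mean(statement_index(r, c) for r, c in ratings_and_categories)


# Example: one respondent's ratings of a three-statement item.
print(item_index([(9, "adequate"), (2, "naive"), (5, "plausible")]))
```

A topic-level index could then be obtained by averaging the item indices of the set of items covering that topic, which is consistent with, though not identical to, the hierarchy of statement, item, and topic indices the abstract describes.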