Abstract

This research tests the widespread assumption that response effects due to variations in question form, wording, or context will be greatest among respondents who are least involved with an issue. A meta-analysis of results from 15 split-ballot experiments conducted over a five-year period indicates that the response effects of using counterarguments or middle alternatives in survey questions are, as would be expected, significantly larger among respondents who are less involved with a given issue than among those who are highly involved with it. But the effects of question order and response order appear to be largely unrelated to how involved a respondent is with a particular issue. Issue involvement, then, appears to specify some response effects, but not others.

Full Text

Public opinion researchers have long assumed that the respondents most susceptible to being influenced by the way a question is worded, the form in which it is presented, or the order or context in which it is asked are those who are least involved with an issue and whose views on the subject are therefore not well crystallized or held with much conviction (see, e.g., Cantril, 1944; Converse, 1974; Payne, 1951). But a recent investigation by Krosnick and Schuman (1988), based on nearly thirty experiments conducted over a ten-year period in various national surveys, indicates that this widespread assumption about who is most susceptible to response effects may very well be wrong (cf. Stember and Hyman, 1949-50; Sudman and Bradburn, 1974; and Sudman and Swensen, 1985, cited in Krosnick and Schuman, for previous evidence on the hypothesis). With the notable exception of the effects of offering or omitting a middle response alternative in survey questions, they found no significant relationship between such indicators as attitude importance, intensity, and certainty and an individual's susceptibility to different types of response effects. After considering alternative explanations for these counterintuitive findings, such as the possible unreliability, invalidity, or crudity of their attitude crystallization measures (all of which they find insufficient as rival hypotheses), Krosnick and Schuman have concluded that differences in attitude crystallization do not generally explain why some individuals are susceptible to response effects while others are not, and that each type of response effect is probably determined by a unique psychological moderating variable.

An independent investigation by the author has produced a similar set of results from split-ballot experiments with different topics, conducted in telephone surveys over a five-year period in a major metropolitan area of the United States. Though the results presented here extend and replicate most of those reported by Krosnick and Schuman, there is evidence that issue involvement may specify some types of response effects, but not others.

GEORGE F. BISHOP is Professor of Political Science and a Senior Research Associate at the Institute for Policy Research at the University of Cincinnati. The research reported here was supported in part by grants from the National Science Foundation (SOC78-07407 and SES81-11404). The author would like to thank Jon Krosnick for his consultation on the meta-analysis.
