
Web Survey Bibliography

Title: Reducing the Threat of Sensitive Questions in Online Surveys
Year: 2012
Access date: 30.04.2012



Relevance and Research Question:
It is well documented that online surveys elicit higher reports of socially undesirable behavior than interviewer-administered surveys. However, there are possible exceptions, where the form of the question may inhibit the revelation of prejudicial attitudes. In research exploring race-of-interviewer effects, Krysan and Couper (2003) found some instances where white respondents (for example) gave more negative responses to interviewers than to computerized instruments. In qualitative debriefings, some respondents noted that talking to an interviewer gave them an opportunity to explain their choice of responses; in the CASI condition (as on the Web; see Krysan and Couper, 2005), they could only pick one of the response options, without the opportunity to justify their choice. We called this the “I’m not a racist, but…” phenomenon. An online experiment was designed to explore the hypothesis that, when given an opportunity to explain or clarify their answers, respondents will give more prejudicial responses.
Methods and Data:
Two experiments were embedded in the LISS online probability-based panel in the Netherlands. In both studies, a set of nine items on attitudes toward immigrants was administered. In the first study, conducted in August 2009 (n=4639), a random half of respondents received an open-ended question on a separate page following each closed question. In the second study, conducted in December 2010 (n=5328), for a random half of respondents, an optional open-ended comment field appeared below each closed-ended question on the same page.
Results:
The results provide support for the hypothesis. In both experiments, respondents given the open-ended question gave significantly more prejudiced responses (F[1, 4352]=25.6, p<.001 for Experiment 1; F[1, 5326]=7.1, p=.008 for Experiment 2) than those receiving only the closed-ended question. However, contrary to expectation, the effect was larger in Experiment 1 than in Experiment 2. We explore this finding in greater detail, examining both responses to individual items and the respondents who used the text box to offer comments.
Added Value:
This study suggests value in giving respondents the opportunity to voice their opinions in their own words, rather than just requiring them to agree or disagree with one of the response options.

Access/Direct link: GOR Homepage (abstract) / (presentation)

Year of publication: 2012
Bibliographic type: Conferences, workshops, tutorials, presentations

Author: Couper, M. P.
