Web Survey Bibliography
Relevance and Research Question:
It is well documented that online surveys elicit higher reports of socially undesirable behavior than interviewer-administered surveys. However, there are possible exceptions, in which the form of the question may inhibit the revelation of prejudicial attitudes. In research exploring race-of-interviewer effects, Krysan and Couper (2003) found some instances where white respondents, for example, gave more negative responses to interviewers than to computerized instruments. In qualitative debriefings, some respondents noted that talking to an interviewer gave them an opportunity to explain their choice of responses; in the CASI (computer-assisted self-interviewing) condition, as on the Web (see Krysan and Couper, 2005), they could only pick one of the response options, without the opportunity to justify their choice. We called this the “I’m not a racist, but…” phenomenon. Online experiments were designed to explore the hypothesis that, when given an opportunity to explain or clarify their answers, respondents will give more prejudicial responses.
Methods and Data:
Two experiments were embedded in the LISS online probability-based panel in the Netherlands. In both, a set of nine items on attitudes toward immigrants was administered. In the first study, conducted in August 2009 (n=4639), a random half of respondents received an open-ended question on a separate page following each closed-ended question. In the second study, conducted in December 2010 (n=5328), a random half of respondents instead saw an optional open-ended comment field below each closed-ended question on the same page.
Results:
The results support the hypothesis: in both experiments, respondents given the open-ended question gave significantly more prejudiced responses than those who received only the closed-ended question (F[1, 4352]=25.6, p<.001 for Experiment 1; F[1, 5326]=7.1, p=.008 for Experiment 2). However, contrary to expectation, the effect was larger in Experiment 1 than in Experiment 2. We explore this finding in greater detail, examining both responses to individual items and the responses of those who used the text box to offer comments.
Added Value:
This study suggests value in giving respondents the opportunity to voice their opinions in their own words, rather than just requiring them to agree or disagree with one of the response options.
Web survey bibliography - General Online Research Conference (GOR) 2012
- Reducing the Threat of Sensitive Questions in Online Surveys; 2012; Couper, M. P.