Web Survey Bibliography
Relevance and Research Question:
It is well-documented that online surveys elicit higher reports of socially undesirable behavior than interviewer-administered surveys. However, there are possible exceptions, where the form of the question may inhibit the revelation of prejudicial attitudes. In research exploring race of interviewer effects, Krysan and Couper (2003) found some instances where white respondents (for example) gave more negative responses to interviewers than to computerized instruments. In qualitative debriefings, some respondents noted that talking to an interviewer gave them an opportunity to explain their choice of responses; in the CASI condition (as on the Web; see Krysan and Couper, 2005), they could only pick one of the response options, without the opportunity to justify their choice. We called this the “I’m not a racist, but…” phenomenon. An online experiment was designed to explore the hypothesis that, when given an opportunity to explain or clarify their answers, respondents will give more prejudicial responses.
Methods and Data:
Two experiments were embedded in the LISS online probability-based panel in the Netherlands. In both, a set of nine items on attitudes toward immigrants was administered. In the first study, conducted in August 2009 (n=4639), a random half of respondents received an open-ended question on a separate page following each closed question. In the second study, conducted in December 2010 (n=5328), a random half of respondents saw an optional open-ended comment box below each closed-ended question on the same page.
Results:
The results support the hypothesis. In both experiments, respondents given the open question gave significantly more prejudiced responses than those receiving only the closed-ended question (F[1, 4352]=25.6, p<.001 for Experiment 1; F[1, 5326]=7.1, p=.008 for Experiment 2). However, contrary to expectation, the effect was larger in Experiment 1 than in Experiment 2. We explore this finding in greater detail, examining responses to individual items and the comments of those who made use of the text box.
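The reported F-tests compare mean prejudice scores between the two randomly assigned conditions, which amounts to a one-way ANOVA with two groups. A minimal sketch of that comparison, using entirely synthetic data (the group sizes, scale, means, and variances below are illustrative assumptions, not the study's data):

```python
# Illustrative sketch of a two-group one-way ANOVA, as in the reported
# F-tests. All data here are synthetic and hypothetical.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Hypothetical attitude scores on a 1-5 scale; the "open question" group
# is simulated with a slightly higher (more prejudiced) mean.
closed_only = rng.normal(loc=3.0, scale=0.8, size=2300)
with_open_question = rng.normal(loc=3.15, scale=0.8, size=2300)

f_stat, p_value = f_oneway(closed_only, with_open_question)
df_within = len(closed_only) + len(with_open_question) - 2
print(f"F(1, {df_within}) = {f_stat:.1f}, p = {p_value:.4f}")
```

With two groups, this F-test is equivalent to a two-sample t-test (F = t²); the same comparison could be run either way.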
Added Value:
This study suggests there is value in giving respondents the opportunity to voice their opinions in their own words, rather than only requiring them to agree or disagree with one of the response options.
GOR Homepage (abstract) / (presentation)
Web Survey Bibliography (6390)
- How Representative are Google Consumer Surveys?: Results From an Analysis of a Google Consumer Survey...; 2013; Krishnamurty, P., Tanenbaum, E., Stern, M. J.
- One Drink or Two: Does Quantity Depicted in an Image Affect Web Survey Responses?; 2013; Charoenruk, N., Stange, M.
- A Comparison Between Screen/Follow Item Format and Yes/No Item Format on a Multi-Mode Federal Survey; 2013; Hernandez, S. J., Arakelyan, S. N., Welch, V. E.
- Using Multiple Modes in Follow-Up Contacts in Random-Digit Dialing Surveys; 2013; Chowdhury, P. P.
- Tablets and Smartphones and Netbooks, Oh My! Effects of Device Type on Respondent Behavior; 2013; Ross, H., Mendelson, J., Lackey, M.
- Impacts of Unit Nonresponse in a Recontact Study of Youth; 2013; Mendelson, J., Viera Jr., L.
- Multi-Mode Survey Administration: Does Offering Multiple Modes at Once Depress Response Rates?; 2013; Newsome, J., Levin, K., Langetieg, P., Vigil, M., Sebastiani, M.
- Responsive Design for Web Panel Data Collection; 2013; Bianchi, A., Biffignandi, S.
- Utilizing the Web in a Multi-Mode Survey; 2013; Venkataraman, L.
- Changing to a Mixed-Mode Design: The Role of Mode in Respondents’ Decisions About Participation...; 2013; Collins, D., Mitchell, M., Toomes, M.
- Comparing the Effects of Mode Design on Response Rate, Representativeness, and Cost Per Complete in...; 2013; Tully, R.
- Internet Response for the Decennial Census – 2012 National Census Test; 2013; Reiser, C.
- The Effects of Pushing Web in a Mixed-Mode Establishment Data Collection; 2013; Ellis, C.
- Using Web Ex to Conduct Usability Testing of an On-Line Survey Instrument; 2013; Stettler, K.
- Effects of Lotteries on Response Behavior in Online Panels; 2013; Goeritz, A., Luthe, S. C.
- Lotteries and study results in market research online panels; 2013; Goeritz, A., Luthe, S. C.
- The Effects of Errors in Paradata on Weighting Class Adjustments: A Simulation Study; 2013; West, B. T.
- Using Paradata to Study Response to Within-Survey Requests; 2013; Sakshaug, J. W.
- Paradata for Coverage Research; 2013; Eckman, S.
- Improving Surveys with Paradata: Analytic Uses of Process Information; 2013; Kreuter, F.
- Theory of adaptation or survival of the fittest?; 2013; Cavallaro, K.
- Online Fundraising Essentials, Second Edition; 2013; Stevenson, S. C.
- Tips for Evaluating Online Effectiveness; 2013; Stevenson, S. C.
- The Digital Divide: The internet and social inequality in international perspective; 2013; Ragnedda, M., Muschert, G.
- Ten questions to ask your online survey provider; 2013; Williams, D.
- Survey quality prediction system 2.0; 2013
- Practical tools for designing and weighting survey samples; 2013; Valliant, R. L., Dever, J. A., Kreuter, F.
- Paradata in web surveys; 2013; Callegaro, M.
- Report Of The AAPOR Task Force On Non-probability sampling; 2013; Baker, R. P., Brick, J. M., Bates, N., Battaglia, M. P., Couper, M. P., Dever, J. A., Gile, K. J., Tourangeau...
- Incentive effects; 2013; Goeritz, A.
- A nationwide web-based freight data collection; 2013; Samimi, A., Mohammadian, A., Kawamura, K.
- Mode Matters: Evaluating Response Comparability in a Mixed-Mode Survey; 2013; Bowyer, B. T., Rogowski, J. C.
- Comparing Survey Results Obtained via Mobile Devices and Computers: An Experiment With a Mobile Web...; 2013; de Bruijne, M., Wijnant, A.
- Cognitive Probes in Web Surveys: On the Effect of Different Text Box Size and Probing Exposure on Response...; 2013; Behr, D., Bandilla, W., Kaczmirek, L., Braun, M.
- The E-Interview in Qualitative Research; 2013; Bampton, R., Cowton, C., Downs, Y.
- Methodological Considerations of Qualitative Email Interviews; 2013; Nehls, K.
- Best Practice in Online Survey Research with Sensitive Topics; 2013; Kays, K., Keith, T. L., Broughal, M. T.
- Research Intentions are Nothing without Technology: Mixed-Method Web Surveys and the Coberen Wall of...; 2013; Ganassali, S., Rodriguez-Santos, C.
- Reducing Response Burden for Enterprises Combining Methods for Data Collection on the Internet; 2013; Vik, T.
- Measuring Wages Worldwide: Exploring the Potentials and Constraints of Volunteer Web Surveys; 2013; Steinmetz, S., Raess, D., Tijdens, K., de Pedraza, P.
- Using Web Surveys for Psychology Experiments: A Case Study in New Media Technology for Research; 2013; Peden, B. F., Tiry, A. M.
- The Distinctiveness of Online Research: Descriptive Assemblages, Unobtrusiveness, and Novel Kinds of...; 2013; Lanfrey, D.
- Sampling, Channels, and Contact Strategies in Internet Survey; 2013; Macrì, E., Tessitore, C.
- Advancing Research Methods with New Technologies; 2013; Sappleton, N.
- Data Quality in PC and Mobile Web Surveys; 2013; Mavletova, A. M.
- PDAs in socio-economic surveys: instrument bias, surveyor bias or both?; 2013; Escobal, J., Benites, S.
- Virtual research assistants: Replacing human interviewers by automated avatars in virtual worlds; 2013; Hasler, B. S., Tuchman, P., Friedman, D.
- Compared to a small, supervised lab experiment, a large, unsupervised web-based experiment on a previously...; 2013; Ryan, R. S., Wilde, M., Crist, S.
- From mixed-mode to multiple devices. Web surveys, smartphone surveys and apps: has the respondent gone...; 2013; Callegaro, M.
- Moving an established survey online – or not?; 2013; Barber, T., Chilvers, D., Kaul, S.