Web Survey Bibliography
Relevance and Research Question:
It is well documented that online surveys elicit higher reports of socially undesirable behavior than interviewer-administered surveys. However, there are possible exceptions, where the form of the question may inhibit the revelation of prejudicial attitudes. In research exploring race-of-interviewer effects, Krysan and Couper (2003) found some instances where white respondents, for example, gave more negative responses to interviewers than to computerized instruments. In qualitative debriefings, some respondents noted that talking to an interviewer gave them an opportunity to explain their choice of responses; in the CASI condition (as on the Web; see Krysan and Couper, 2005), they could only pick one of the response options, without the opportunity to justify their choice. We called this the “I’m not a racist, but…” phenomenon. An online experiment was designed to explore the hypothesis that, when given an opportunity to explain or clarify their answers, respondents will give more prejudicial responses.
Methods and Data:
Two experiments were embedded in the LISS online probability-based panel in the Netherlands. In both studies, a set of nine items on attitudes toward immigrants was administered. In the first study, conducted in August 2009 (n=4639), a random half of respondents received an open-ended question on a separate page following each closed question. In the second study, conducted in December 2010 (n=5328), an optional open-ended comment field appeared below each closed-ended question on the same page for a random half of respondents.
Results:
The results support the hypothesis. In both studies, respondents given the open question gave significantly more prejudiced responses (F[1, 4352]=25.6, p<.001 for Exp. 1; F[1, 5326]=7.1, p=.008 for Exp. 2) than those receiving only the closed-ended question. However, contrary to expectation, the effect was larger in Experiment 1 than in Experiment 2. We explore this finding in greater detail, examining both responses to individual items and the comments of those who made use of the text box.
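The between-condition comparisons above are one-way F-tests with condition (open question present vs. absent) as the single factor. A minimal sketch of how such an F statistic is computed, using simulated attitude scores rather than the actual LISS data (the group sizes, means, and variance here are illustrative assumptions, not the study's values):

```python
import random

def one_way_f(group_a, group_b):
    """F statistic for a one-way ANOVA with two groups
    (equivalent to the squared two-sample t statistic)."""
    all_vals = group_a + group_b
    grand = sum(all_vals) / len(all_vals)
    means = [sum(g) / len(g) for g in (group_a, group_b)]
    # Between-groups sum of squares: group size times squared
    # deviation of each group mean from the grand mean.
    ss_between = sum(len(g) * (m - grand) ** 2
                     for g, m in zip((group_a, group_b), means))
    # Within-groups sum of squares: squared deviations of each
    # observation from its own group mean.
    ss_within = sum((x - m) ** 2
                    for g, m in zip((group_a, group_b), means)
                    for x in g)
    df_between = 1                     # k - 1, with k = 2 groups
    df_within = len(all_vals) - 2      # N - k
    return (ss_between / df_between) / (ss_within / df_within)

# Illustrative only: simulated 1-5 attitude scores, with the
# open-question condition shifted slightly upward.
random.seed(42)
closed_only = [random.gauss(3.0, 1.0) for _ in range(2000)]
with_open = [random.gauss(3.1, 1.0) for _ in range(2000)]
f_stat = one_way_f(closed_only, with_open)
print(f"F(1, {len(closed_only) + len(with_open) - 2}) = {f_stat:.1f}")
```

With one numerator degree of freedom, as in both experiments reported above, this F is simply the squared t from a two-sample comparison of the condition means.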
Added Value:
This study suggests value in giving respondents the opportunity to voice their opinions in their own words, rather than only requiring them to agree or disagree with one of the response options.
GOR Homepage (abstract) / (presentation)
Web Survey Bibliography - USA (2403)
- Educational use of smart phone technology: A survey of mobile phone application use by undergraduate...; 2013; Bomhold, C. R.
- Does website‐based information add any value in counseling mothers expecting a baby with severe...; 2013; Engels, A. C. et al.
- Middle School Sexual Harassment, Violence and Social Networks; 2013; Mumford, E. A. et al.
- Customer satisfaction in Web 2.0 and information technology development; 2013; Sharma, G., Baoku, L.
- To preserve or not to preserve; 2013; Mersereau, J. E. et al.
- Media Commons: the process and value of centralizing services; 2013; Mestre, L. S.
- Developing and completing a library mobile technology survey to create a user-centered mobile presence...; 2013; Becker, D. A., Bonadie-Joseph, I., Cain, J.
- Dialogistic Presence on Community College Websites in Nine Megastates; 2013; Shadinger, D.
- Foreign workers in Saudi Arabia: a field study of role ambiguity, identification, information-seeking...; 2013; Showail, S. J., McLean Parks, J., Smith, F. L.
- Should the third reminder be sent? The role of survey response timing on web survey results; 2013; Rao, K., Pennington, J.
- Assessing Nonresponse Bias in the Green Technologies and Practices Survey; 2013; Meekins, B., Sverchkov, M., Stang, S.
- Using an Item Response Theory Approach to Measure Survey Mode of Administration Effects: Analysis of...; 2013; Mariano, L. T., Elliott, M. N.
- How Does Online Survey Mode Affect Answers to Customer Feedback Loyalty Surveys?; 2013; Gupta, A., Lee, J.
- Using Internet Panel Surveys for Behavioral Health Surveillance; 2013; Gotway Crawford, C., Okoro, C. A., Dhingra, S., Akcin, H., Zhao, G., Ford, D., Pierannunzi, C.
- Model-Based Mode of Data Collection Switching from Internet to Mail in the American Community Survey; 2013; Chesnut, J.
- Estimating Mode Effects Without Bias: A Randomized Experiment to Compare Mode Effects Between Face-to...; 2013; Rivers, D., Vavreck, L.
- How Mobile Stacks Up to Traditional Online: A Comparison of Studies; 2013; Knowles, R.
- Why respondents suffer if you're not mobile-ready; 2013; Knapton, K.
- Thoughts on retrieving information from open-ended questions; 2013; Luyens, S.
- Social media data demands a marriage of high-tech and high-touch; 2013; Waldheim, C., Stevens, N.
- Book Review: Brand Together: How Co-creation Generates Innovation and Re-energizes Brands, by Nicholas...; 2013; Wilson, A.
- Digging deeper: using implicit tests to define consumers' semantic network; 2013; Riviere, P., Cuny, C., Allain, G., Vereijken, C.
- Conceptualising and evaluating experiences with brands on Facebook; 2013; Smith, S.
- Lotteries and study results in market research online panels; 2013; Goeritz, A., Luthe, S. C.
- Estimates on the effectiveness of web application firewalls against targeted attacks; 2013; Holm, H., Ekstedt, M.
- Respondent Rewards: Money for Nothing?; 2013; Martin, P.
- How to make your questionnaire mobile-ready; 2013; Cape, P. J.
- Did I Do That? How Trap Questions Can Hurt Data Quality; 2013; Phillips, K.
- Leveraging mobile and online qualitative to get inside shoppers’ heads; 2013; Bryson, J., Ritzo, J.
- Utilization of High-Technology to Collect Health Risk Assessment Information from Medicare Members:...; 2013; Freedman, D., VanderHorst, N.
- Comparison of Smartphone and Online Computer Survey Administration; 2013; Wells, T., Bailey, J., Link, M. W.
- Attitudes of Nebraska Residents on Nebraska Water Management; 2013; Dillman, D. A., Edwards, M. L.
- Attitudes of Washington Residents on Washington Water Management; 2013; Dillman, D. A., Edwards, M. L.
- A Comparison of Data Quality Across Modes in a Mixed-Mode Collection of Administrative Records; 2013; Worthy, M., Mayclin, D.
- Reconceptualizing Survey Representativeness for Evaluating and Using Nonprobability Samples; 2013; Fan, D. P.
- To Click, Type, or Drag? Evaluating Speed of Survey Data Input Methods; 2013; Husser, J. A.
- Unit Nonresponse and Weighting Adjustments: A Critical Review; 2013; Brick, J. M.
- Measuring the impact of the Web: Rasch modelling for survey evaluation; 2013; Annoni, P., Weziak-Bialowolska, D., Farhan, H.
- Online Case Studies: HESI Exit Exam Scores and NCLEX-RN Outcomes; 2013; Young, A., Rose, G., Willson, P.
- The Challenges and Benefits of Distance Mentoring; 2013; Lach, H. W. et al.
- Hail to Thee, Our Alma Mater: Alumni Role Identity and the Relationship to Institutional Support Behaviors...; 2013; McDearmon, J. T.
- Using Tablet Computers For “Intentional” Mobile Research; 2013; Seal, J.
- Encouraging Record Use For Financial Asset Questions In A Web Survey; 2013; Couper, M. P., Ofstedal, M. B., Lee, S.
- Experiments with methods to reduce attrition in longitudinal surveys; 2013; Fumagalli, L., Laurie, H., Lynn, P.
- How incentives affect web-based survey response rates of athletic program donors; 2013; Alvarado, G., Callison, C.
- Investigating the Relationship among Prepaid Token Incentives, Response Rates, and Nonresponse Bias...; 2013; Parsons, N. L., Manierre, M. J.
- The Effect of Survey Mode on High School Risk Behavior Data: a Comparison between Web and Paper-based...; 2013; Raghupathy, S., Hahn-Smith, S.
- Permission email messages significantly increase gambler retention; 2013; Jolley, W., Lee, A., Mizerski, R., Sadeque, S.
- How virtual corporate social responsibility dialogs generate value: A framework and propositions; 2013; Korschun, D., Du, S.
- Understanding service quality in a virtual travel community environment; 2013; Elliot, S., Li, G., Choi, C.