Web Survey Bibliography
The popularity of mixed mode survey designs has led to increased interest in mode preference. Previous research has shown that offering respondents their preferred mode can increase response rates (Olson, Smyth and Wood 2009), but the effect of mode preference on the quality of survey measurement remains unexplored. In this paper, we examine a variety of experimental questionnaire design manipulations, evaluating whether respondents who received their preferred mode produced higher-quality data than those who did not. Respondents were asked about their preferred mode and their willingness to be recontacted in a 2008 survey, and those who agreed were surveyed again in 2009. The 2009 Quality of Life in a Changing Nebraska survey (n=565, AAPOR RR2=46%) randomly assigned respondents to mail or web modes and to one of two questionnaires. The two questionnaires systematically varied many elements of questionnaire design (e.g., question order, text box labels, forced choice versus check-all-that-apply formats). Almost one quarter (24%) of the respondents were matched with the mode they had stated as preferred the previous year. Preliminary analyses indicate significant differences in data quality between those who did and did not receive their preferred mode. In particular, respondents who received their preferred mode showed few significant differences in the rate of endorsement of items between check-all and forced choice formats. In contrast, respondents who did not receive their preferred mode were more likely to endorse statements in a forced choice format than in a check-all format, consistent with previous literature, even after controlling for respondent characteristics. These findings may indicate better cognitive processing of the check-all format when respondents receive the mode they prefer. The results provide insight into the impact of mode preference on commonly used questionnaire design features.
Web survey bibliography - Olson, K. (13)
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K., Smyth, J. D.
- Identifying predictors of survey mode preference; 2015; Millar, M. M., Olson, K., Smyth, J. D.
- The Effect of Answering in a Preferred Versus a Non-Preferred Survey Mode on Measurement; 2014; Smyth, J. D., Olson, K., Kasabian, A.
- Assessing Within-Household Selection Methods in Household Mail Surveys; 2014; Olson, K., Stange, M., Smyth, J. D.
- Accuracy of Within-household Selection in Web and Mail Surveys of the General Population; 2014; Olson, K., Smyth, J. D.
- Using Eye Tracking to Examine the Visual Design of Web Surveys; 2014; Zhou, Q., Ricci, K., Olson, K., Smyth, J. D.
- Analyzing Paradata to Investigate Measurement Error; 2013; Yan, T., Olson, K.
- Are You Seeing What I am Seeing? Exploring Response Option Visual Design Effects With Eye-Tracking; 2013; Libman, A., Smyth, J. D., Olson, K.
- Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation Rates?; 2012; Olson, K., Smyth, J. D., Wood, H.
- Literacy and Data Quality in Self-Administered Surveys; 2011; Smyth, J. D., Olson, K.
- Medium Node: NSF Census Research Network; 2011; McCutcheon, A. L., Belli, R. F., Olson, K., Smyth, J. D., Soh, L.-K.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys; 2011; Olson, K., Smyth, J.
- When do nonresponse follow-ups improve or reduce data quality? A meta-analysis and review of the existing...; 2008; Olson, K., Feng, C., Witt, L.