Web Survey Bibliography
Recent research has shown that mixing web and mail data collection modes can improve coverage and reduce nonresponse. It is generally assumed that, because web and mail are both visual modes, they produce comparable data, but little empirical research has examined this assumption. Now that surveyors can mix web and mail modes relatively easily, we need to know whether measurement is, in fact, comparable. Open-ended questions (number boxes and text boxes) seem especially problematic because respondents often have full control over how they answer them; their answers are not structured and guided in the same way as closed-ended questions with limited response options. This is especially true in self-administered surveys, where there is no interviewer to probe, ensure that the desired type of answer is provided, or convert the respondent's answer into the desired format.

In this paper, we examine item-nonresponse rates, response distributions, and the effects of questionnaire design features on a variety of open-ended questions from the Quality of Life in a Changing Nebraska (QLCN) survey. Where possible, we also examine the effects of design changes by subgroup (e.g., by respondents expected to be more or less familiar with each mode). The QLCN was conducted between July and September 2009 (N=566; AAPOR RR1 = 46%) and contained eleven open-ended boxes requesting numeric information and two open-ended boxes requesting descriptive text. In addition to being randomly assigned to either the web or mail mode, respondents were randomly assigned to one of two questionnaire design treatments. The questionnaire design experiments include a question order experiment, small versus large box size on both numeric and text questions, and presence versus absence of answer box labels on numeric questions.
Web survey bibliography - Olson, K. (13)
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K., Smyth, J. D.
- Identifying predictors of survey mode preference; 2015; Millar, M. M., Olson, K., Smyth, J. D.
- The Effect of Answering in a Preferred Versus a Non-Preferred Survey Mode on Measurement; 2014; Smyth, J. D., Olson, K., Kasabian, A.
- Assessing Within-Household Selection Methods in Household Mail Surveys; 2014; Olson, K., Stange, M., Smyth, J. D.
- Accuracy of Within-household Selection in Web and Mail Surveys of the General Population; 2014; Olson, K., Smyth, J. D.
- Using Eye Tracking to Examine the Visual Design of Web Surveys; 2014; Zhou, Q., Ricci, K., Olson, K., Smyth, J. D.
- Analyzing Paradata to Investigate Measurement Error; 2013; Yan, T., Olson, K.
- Are You Seeing What I am Seeing? Exploring Response Option Visual Design Effects With Eye-Tracking; 2013; Libman, A., Smyth, J. D., Olson, K.
- Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation Rates?; 2012; Olson, K., Smyth, J. D., Wood, H.
- Literacy and Data Quality in Self-Administered Surveys; 2011; Smyth, J. D., Olson, K.
- Medium Node: NSF Census Research Network; 2011; McCutcheon, A. L., Belli, R. F., Olson, K., Smyth, J. D., Soh, L.-K.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys; 2011; Olson, K., Smyth, J.
- When do nonresponse follow-ups improve or reduce data quality? A meta-analysis and review of the existing...; 2008; Olson, K., Feng, C., Witt, L.