Web Survey Bibliography
Non-sampling errors, particularly those arising from non-response and from the measurement process itself, present a special challenge to survey methodologists because it is not always easy to disentangle their joint effects on the data. Given that the factors influencing the decision to participate in a survey may also influence respondents' motivation and ability to answer the survey questions, variation in response quality may be caused simultaneously by non-response bias and measurement error.

In this study, we examine factors underlying both kinds of error using data from the 2008 ANES Internet Panel. Using interview data and paradata from the initial recruitment survey, we investigate the relationship between recruitment effort (e.g. the number of contact attempts and the use of refusal conversion efforts), willingness to participate in subsequent panel waves, and the ability and motivation to optimize during questionnaire completion.

We find that respondents who were hardest to reach, or hardest to persuade to participate in the recruitment interview, responded to fewer monthly panel surveys overall and were more likely to stop participating in the panel altogether. They also had higher rates of item non-response in the recruitment interview. Respondents who later stopped participating in the panel were also more likely to give answers of reduced quality in the wave 1 monthly survey (e.g. more midpoint answers, less differentiation between scale points in question batteries, and fewer responses to a check-all-that-apply question format). We then investigated two potential common causes of the observed relationship between the propensity to stop participating in the panel and response quality (interest in computers and ‘need to evaluate’), but neither fully accounted for it: interest in computers predicted later panel cooperativeness, while need to evaluate was related both to response quality and to the propensity to attrit.
Finally, we examine whether the panelists most likely to stop participating in the panel are also more likely to learn shortcutting strategies over time in order to reduce the effort needed to complete the monthly surveys. We find some support for this hypothesis. We discuss our findings and their implications for the design of future online panels.
Nonresponse and measurement error in an online panel; 2014; Roberts, C., Allum, N., Sturgis, P.