
Web Survey Bibliography

Title: Nonresponse and measurement error in an online panel
Year: 2014
Access date: 30.06.2014
Abstract

Non-sampling errors, in particular those arising from non-response and the measurement process itself, present a particular challenge to survey methodologists, because it is not always easy to disentangle their joint effects on the data. Given that factors influencing the decision to participate in a survey may also influence respondents' motivation and ability to answer the survey questions, variations in the quality of responses may simultaneously be caused by both non-response bias and measurement error. In this study, we examine factors underlying both kinds of error using data from the 2008 ANES Internet Panel. Using interview data and paradata from the initial recruitment survey, we investigate the relationship between recruitment effort (e.g. number of contact attempts, use of refusal conversion efforts), willingness to participate in subsequent panel waves, and the ability and motivation to optimize during questionnaire completion. We find that respondents who were hardest to reach or persuade to participate in the recruitment interview responded to fewer monthly panel surveys overall and were more likely to stop participating in the panel altogether. They also had higher rates of item non-response in the recruitment interview. Respondents who later stopped participating in the panel were also more likely to give answers of reduced quality in the wave 1 monthly survey (e.g. more midpoint answers, less differentiation between scale points for question batteries, and fewer responses to a check-all-that-apply question format). We then investigate two potential common causes of the observed relation between the propensity to stop participating in the panel and response quality (interest in computers and ‘need to evaluate’), but neither one fully accounts for it. Interest in computers predicted later panel cooperativeness, while need to evaluate was related both to response quality and to the propensity to attrit. Finally, we examine whether the panelists most likely to stop participating in the panel are also more likely to learn shortcutting strategies over time in order to reduce the effort needed to complete the monthly surveys. We find some support for this hypothesis. We discuss our findings and their implications for the design of future online panels.

Year of publication: 2014
Bibliographic type: Book section