Web Survey Bibliography
Relevance & Research Question: Many studies have found that the wording of a survey question can influence the answers that respondents provide. In particular, it has been shown that vague and ambiguous terms are often interpreted idiosyncratically by respondents and can thus introduce systematic bias into the survey data. Beyond ambiguity, the cognitive effort required to understand survey questions may affect data quality in a similar way. Earlier research identified several problematic text features (such as low-frequency words, left-embedded syntactic structures, and low syntactic redundancy) that reduce question clarity and make survey questions difficult to comprehend (e.g., Lenzner, Kaczmirek, & Lenzner, 2010). This paper extends these earlier findings and examines whether the effort required to comprehend survey questions affects data quality.
Methods & Data: An experiment was carried out in which respondents were asked to complete two Web surveys (N1 = 825, N2 = 515) at a two-week interval. Approximately half of the respondents answered questionnaires that included unclear, less comprehensible questions, while the other half received control questions that were easier to comprehend. Indicators of data quality were drop-out rates, the number of non-substantive ("don't know") responses, the number of neutral (midpoint) responses, and the over-time consistency of responses across the two surveys. In addition, respondents' verbal intelligence and motivation were assessed to examine whether question clarity effects were moderated by these two respondent characteristics.
Results: As expected, respondents receiving unclear questions provided lower-quality responses than respondents answering more comprehensible questions. Moreover, some of these effects were more pronounced among respondents with limited verbal skills and among respondents with low motivation to answer surveys.
Added value: These findings indicate that survey results can be systematically biased if questions are difficult to understand and exceed the processing effort that respondents are willing or able to invest. Making it easy for respondents to retrieve the meaning of a survey question seems to be an important requirement for obtaining high-quality answers.