Web Survey Bibliography
When survey respondents face long, cognitively demanding interviews, they may take measures to reduce their cognitive burden. In particular, when questions are presented in a predictable pattern, respondents can quickly learn that certain behaviors will get them to the end of the interview faster. One frequently used pattern is the interleafed format, in which each filter question in a series is followed by its follow-up questions only if the respondent answers the filter affirmatively. In such cases, respondents may learn that negative responses to filter questions shorten the interview, and the resulting underreporting can bias survey estimates.
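To make the routing concrete, here is a minimal sketch (with hypothetical question identifiers and a generic `ask` callback, none of which come from the study) contrasting an interleafed flow with a grouped flow. It shows why, in the interleafed case, a negative filter answer immediately skips the follow-ups and shortens the interview.

```python
# Hypothetical filter questions and their follow-ups (illustrative only).
FILTERS = {
    "saw_ad_brand_a": ["where_seen_a", "liked_ad_a"],
    "saw_ad_brand_b": ["where_seen_b", "liked_ad_b"],
}

def interleafed_flow(ask):
    """Ask each filter and, immediately after it, its follow-ups if answered 'yes'."""
    answers = {}
    for filter_q, followups in FILTERS.items():
        answers[filter_q] = ask(filter_q)
        if answers[filter_q] == "yes":       # a "no" skips these follow-ups right away
            for q in followups:
                answers[q] = ask(q)
    return answers

def grouped_flow(ask):
    """Ask all filters first, then the follow-ups for every 'yes'."""
    answers = {q: ask(q) for q in FILTERS}
    for filter_q, followups in FILTERS.items():
        if answers[filter_q] == "yes":
            for q in followups:
                answers[q] = ask(q)
    return answers
```

For example, `interleafed_flow(lambda q: "no")` asks only the two filters, which is the shortcut respondents can discover.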
This presentation will report results from two recent surveys of members of a probability-based web panel in which respondents were asked a series of advertising recall questions in an interleafed format. The order of the ads was randomly assigned for each respondent and recorded. Preliminary results show that later placement in the order of presentation has a significant negative effect on reported ad recall. The presentation will also discuss how respondent demographics, interest in the survey topic, and length of time spent in the panel affect the extent of underreporting, and will close with implications for existing survey practice and directions for future research.
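One way to examine the position effect described above is a logistic regression of recall on presentation order. The sketch below uses simulated data; the column names, effect sizes, and model specification are illustrative assumptions, not the authors' actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_resp, n_ads = 500, 10

# Simulated long-format data: one row per respondent x ad, with a random order
# per respondent and a recall probability that declines with position.
rows = []
for r in range(n_resp):
    order = rng.permutation(n_ads) + 1                      # positions 1..n_ads
    for ad, pos in enumerate(order):
        p_recall = 1 / (1 + np.exp(-(0.3 - 0.08 * pos)))
        rows.append({"resp_id": r, "ad_id": ad, "position": pos,
                     "recalled": rng.binomial(1, p_recall)})
df = pd.DataFrame(rows)

# A negative coefficient on 'position' corresponds to the underreporting
# pattern described above: ads placed later are recalled less often.
model = smf.logit("recalled ~ position", data=df).fit(disp=False)
print(model.summary())
```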
Web survey bibliography - European survey research association conference 2011, ESRA, Lausanne (35)
- Effects of speeding on satisficing in Mixed-Mode Surveys; 2011; Bathelt, S., Bauknecht, J.
- Quantifying Open-Ended Responses: Results from an Online Advertising Tracking Survey; 2011; Jacobe, A., Brewer, L., Vakalia, F., Turner, S., Marsh, S. M.
- Quality of responses to an open-ended question on a mixed-mode survey; 2011; Gibson, J., Vakalia, F., Turner, S.
- Open-ended questions in the context of temporary work research; 2011; Siponen, K.
- How do Respondents Perceive a Questionnaire? The Contribution of Open-ended Questions; 2011; Markou, E., Garnier, B.
- The Uses of Open-Ended Questions in Quantitative Surveys; 2011; Singer, E., Couper, M. P.
- Agree-Disagree Response Format versus Importance Judgment; 2011; Krebs, D.
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.
- Germans' segregation preferences and immigrant group size: A factorial survey approach; 2011; Schlueter, E., Ullrich, J., Schmidt, P.
- Errors within web-based surveys: a comparison between two different tools for the analysis of tourist...; 2011; Polizzi, G., Oliveri, A. M.
- Benefits of Structured DDI Metadata across the Data Lifecycle: The STARDAT Project at the GESIS Data...; 2011; Linne, M., Brislinger, E., Zenk-Moeltgen, W.
- Microdata Information System MISSY; 2011; Bohr, J.,
- The Use of Structured Survey Instrument Metadata throughout the Data Lifecycle; 2011; Hansen, S. E.
- DDI and the Lifecycle of Longitudinal Surveys; 2011; Hoyle, L., Wackerow, J.
- Dissemination of survey (meta)data in the LISS data archive; 2011; Streefkerk, M., Elshout, S.
- Underreporting in Interleafed Questionnaires: Evidence from Two Web Surveys; 2011; Medway, R., Viera Jr., L., Turner, S., Marsh, S. M.
- The use of cognitive interviewing methods to evaluate mode effects in survey questions; 2011; Gray, M., Blake, M., Campanelli, P., Hope, S.
- Does the direction of Likert-type scales influence response behavior in web surveys?; 2011; Keusch, F.
- Cross-country Comparisons: Effects of Scale Type and Response Style Differences; 2011; Thomas, R. K.
- Explaining more variance with visual analogue scales: A Web experiment; 2011; Funke, F.
- A Comparison of Branching Response Formats with Single Response Formats; 2011; Thomas, R. K.
- Different functioning of rating scale formats – results from psychometric and physiological experiments...; 2011; Koller, M., Salzberger, T.
- Cognitive process in answering questions: Are verbal labels in rating scales attended to?; 2011; Menold, N., Kaczmirek, L., Lenzner, T.
- Experiments on the Design of the Left-Right Self-Assessment Scale; 2011; Zuell, C., Scholz, E., Behr, D.
- Separating selection from mode effects when switching from single (CATI) to mixed mode design (CATI /...; 2011; Carstensen, J., Kriwy, P., Krug, G., Lange, C.
- Testing between-mode measurement invariance under controlled selectivity conditions; 2011; Klausch, L. T.
- Using propensity score matching to separate mode- and selection effects; 2011; Lugtig, P. J., Lensvelt-Mulders, G. J.
- A mixed mode pilot on consumer barometer; 2011; Taskinen, P., Simpanen, M.
- Separation of selection bias and mode effect in mixed-mode survey – Application to the face-to...; 2011; Bayart, C., Bonnel, P.
- Optimization of dual frame telephone survey designs; 2011; Slavec, A., Vehovar, V.
- A Comparison of CAPI and PAPI through a Randomized Field Experiment; 2011; De Weerdt, J.
- Flexibility of Web Surveys: Probing 'do-not-know' over the Phone and on the Web; 2011; Hox, J., de Leeuw, E. D.
- Changing research methods in Ukraine: CATI or Mixed-Mode Surveys?; 2011; Paniotto, V., Kharchenko, N.
- The effects of mixed mode designs on simple and complex analyses; 2011; Martin, P., Lynn, P.
- Measurement Error in Mixed Mode Surveys: Mode or Question Format?; 2011; de Leeuw, E. D., Hox, J.