Web Survey Bibliography
In interviewer-administered surveys, a 'do-not-know' option is usually not offered explicitly to the respondent, but interviewers can accept it. It is good general practice to train interviewers to probe after an initial 'do-not-know' in order to reduce item nonresponse.
In web surveys, designers are hesitant to offer an explicit do-not-know option for fear of encouraging respondents to choose it as a quick answer. On the other hand, not accepting do-not-know and issuing an error message that insists on an answer may lead either to irritation and more break-offs, or to guessing and less valid answers.
In this study, the interactivity of the web is used to emulate friendly interviewer probing behaviour. A high-quality, probability-based Internet panel (the LISS panel, CentERdata) was used for data collection. The questionnaire contained a series of questions that had shown a high percentage of item nonresponse in previous self-administered (mail and web) surveys. A two-by-two experimental design was used: (1) explicitly offering a do-not-know option vs. offering no do-not-know option, and (2) accepting a do-not-know directly vs. accepting it only after a friendly probe. As a baseline for comparison, a fifth condition was added with the 'standard' web option: an error message, with no possibility of continuing without an answer.
The number of resulting do-not-know answers in each condition is evaluated and compared with the results of a telephone survey on the same topic.
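To make the design concrete, here is a minimal TypeScript sketch of how the five conditions could be wired into a web questionnaire's handling of an initial do-not-know. This is not the authors' instrument: the condition names, the handleDontKnow function, and the askAgain callback are all hypothetical illustrations of the logic described in the abstract.

```typescript
// Hypothetical sketch of the 2x2 + baseline design described above.
// None of these identifiers come from the study itself.

type Condition =
  | "dk-offered/accept-directly"        // explicit DK option, accepted at once
  | "dk-offered/accept-after-probe"     // explicit DK option, friendly probe first
  | "dk-not-offered/accept-directly"    // no DK option, blank accepted at once
  | "dk-not-offered/accept-after-probe" // no DK option, friendly probe first
  | "standard-web";                     // baseline: error message, answer required

interface ItemResult {
  answer: string | null; // null records a do-not-know / skipped item
  probed: boolean;       // whether a probe (or error message) was shown
}

// `askAgain` stands in for redisplaying the item, e.g. with the text
// "If you are not sure, your best estimate is fine." It returns the
// respondent's new answer, or null if they still give do-not-know.
function handleDontKnow(
  condition: Condition,
  askAgain: () => string | null
): ItemResult {
  switch (condition) {
    case "dk-offered/accept-directly":
    case "dk-not-offered/accept-directly":
      // Do-not-know (or a blank item) is accepted immediately.
      return { answer: null, probed: false };

    case "dk-offered/accept-after-probe":
    case "dk-not-offered/accept-after-probe":
      // Emulate the interviewer: one friendly probe, then accept
      // whatever comes back, including a second do-not-know.
      return { answer: askAgain(), probed: true };

    case "standard-web": {
      // Baseline: keep showing an error message until an answer is
      // given (capped here so the sketch cannot loop forever).
      let answer: string | null = null;
      for (let i = 0; i < 10 && answer === null; i++) {
        answer = askAgain();
      }
      return { answer, probed: true };
    }
  }
}

// Example: a respondent who maintains do-not-know after the probe.
const result = handleDontKnow("dk-offered/accept-after-probe", () => null);
// result => { answer: null, probed: true }
```

In the probe conditions a single friendly re-ask emulates the interviewer; only the baseline keeps insisting on an answer, which is the behaviour the abstract links to irritation and break-offs.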
Web survey bibliography - European Survey Research Association conference 2011, ESRA, Lausanne (35)
- Effects of speeding on satisficing in Mixed-Mode Surveys; 2011; Bathelt, S., Bauknecht, J.
- Quantifying Open-Ended Responses: Results from an Online Advertising Tracking Survey; 2011; Jacobe, A., Brewer, L., Vakalia, F., Turner, S., Marsh, S. M.
- Quality of responses to an open-ended question on a mixed-mode survey; 2011; Gibson, J., Vakalia, F., Turner, S.
- Open-ended questions in the context of temporary work research; 2011; Siponen, K.
- How do Respondents Perceive a Questionnaire? The Contribution of Open-ended Questions; 2011; Markou, E., Garnier, B.
- The Uses of Open-Ended Questions in Quantitative Surveys; 2011; Singer, E., Couper, M. P.
- Agree-Disagree Response Format versus Importance Judgment; 2011; Krebs, D.
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.
- Germans' segregation preferences and immigrant group size: A factorial survey approach; 2011; Schlueter, E., Ullrich, J., Schmidt, P.
- Errors within web-based surveys: a comparison between two different tools for the analysis of tourist...; 2011; Polizzi, G., Oliveri, A. M.
- Benefits of Structured DDI Metadata across the Data Lifecycle: The STARDAT Project at the GESIS Data...; 2011; Linne, M., Brislinger, E., Zenk-Moeltgen, W.
- Microdata Information System MISSY; 2011; Bohr, J.
- The Use of Structured Survey Instrument Metadata throughout the Data Lifecycle; 2011; Hansen, S. E.
- DDI and the Lifecycle of Longitudinal Surveys; 2011; Hoyle, L., Wackerow, J.
- Dissemination of survey (meta)data in the LISS data archive; 2011; Streefkerk, M., Elshout, S.
- Underreporting in Interleafed Questionnaires: Evidence from Two Web Surveys; 2011; Medway, R., Viera Jr., L., Turner, S., Marsh, S. M.
- The use of cognitive interviewing methods to evaluate mode effects in survey questions; 2011; Gray, M., Blake, M., Campanelli, P., Hope, S.
- Does the direction of Likert-type scales influence response behavior in web surveys?; 2011; Keusch, F.
- Cross-country Comparisons: Effects of Scale Type and Response Style Differences; 2011; Thomas, R. K.
- Explaining more variance with visual analogue scales: A Web experiment; 2011; Funke, F.
- A Comparison of Branching Response Formats with Single Response Formats; 2011; Thomas, R. K.
- Different functioning of rating scale formats – results from psychometric and physiological experiments...; 2011; Koller, M., Salzberger, T.
- Cognitive process in answering questions: Are verbal labels in rating scales attended to?; 2011; Menold, N., Kaczmirek, L., Lenzner, T.
- Experiments on the Design of the Left-Right Self-Assessment Scale; 2011; Zuell, C., Scholz, E., Behr, D.
- Separating selection from mode effects when switching from single (CATI) to mixed mode design (CATI /...; 2011; Carstensen, J., Kriwy, P., Krug, G., Lange, C.
- Testing between-mode measurement invariance under controlled selectivity conditions; 2011; Klausch, L. T.
- Using propensity score matching to separate mode- and selection effects; 2011; Lugtig, P. J., Lensvelt-Mulders, G. J.
- A mixed mode pilot on consumer barometer; 2011; Taskinen, P., Simpanen, M.
- Separation of selection bias and mode effect in mixed-mode survey – Application to the face-to...; 2011; Bayart, C., Bonnel, P.
- Optimization of dual frame telephone survey designs; 2011; Slavec, A., Vehovar, V.
- A Comparison of CAPI and PAPI through a Randomized Field Experiment; 2011; De Weerdt, J.
- Flexibility of Web Surveys: Probing 'do-not-know' over the Phone and on the Web; 2011; Hox, J., de Leeuw, E. D.
- Changing research methods in Ukraine: CATI or Mixed-Mode Surveys?; 2011; Paniotto, V., Kharchenko, N.
- The effects of mixed mode designs on simple and complex analyses; 2011; Martin, P., Lynn, P.
- Measurement Error in Mixed Mode Surveys: Mode or Question Format?; 2011; de Leeuw, E. D., Hox, J.