Web Survey Bibliography
Relevance & Research Question
When designing questionnaires, an important decision is whether or not to include a ‘do-not-know’ option. In interviews this dilemma is solved by not explicitly offering ‘do-not-know’ but accepting it when it occurs: interviewers are instructed to accept a non-substantive answer only after a gentle probe.
Online surveys, being self-administered, lack an interviewer. Web survey designers are therefore hesitant to offer an explicit do-not-know option, and ‘required answer’ is often the default in standard software. However, survey methodologists strongly advise against this forced-answer strategy. Requiring an answer does not ensure that the right answer is given and may lead to irritation and more break-offs, or to guessing and less valid answers, thereby reducing data quality.
Methods and Data
The data were collected among members of the LISS panel, a probability-based panel of the Dutch population. The questionnaire contained questions that in previous self-administered surveys showed a high percentage of item nonresponse. A three-by-two experimental design was used. Factor A compared not offering ‘do-not-know’ explicitly with offering it in two different ways: visually separated from the substantive answer options, or as a special button. Factor B compared accepting a do-not-know immediately with accepting it only after a friendly probe. Respondents were randomly assigned to the experimental conditions.
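The three-by-two design above yields six experimental cells. A minimal sketch, assuming hypothetical condition labels (the study itself does not name its cells this way), of how the conditions can be enumerated and respondents assigned at random:

```javascript
// Factor A: three ways of (not) presenting 'do-not-know' (labels are illustrative).
const factorA = ["no-explicit-DK", "DK-visually-separated", "DK-special-button"];
// Factor B: accept a blank/DK immediately, or only after a friendly probe.
const factorB = ["accept-immediately", "probe-first"];

// Fully crossed design: 3 x 2 = 6 conditions.
const conditions = factorA.flatMap(a => factorB.map(b => ({ a, b })));

// Simple random assignment of a respondent to one of the six cells.
function assignCondition() {
  return conditions[Math.floor(Math.random() * conditions.length)];
}
```

Crossing the factors fully is what allows the effects of offering and of probing to be estimated separately.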
Results
We found clear effects of offering ‘do-not-know’ and of probing. Not explicitly offering do-not-know (but allowing respondents to skip), followed by a friendly probe, resulted in the least missing information. Respondent evaluations showed that when do-not-know was offered explicitly, the questions were experienced as less difficult. When a probe was offered, respondents indicated that the questions made them think more about the topic. These results suggest that offering a do-not-know option without probing gives respondents an easy escape, while probing stimulates the question-answer process. The scale reliabilities support this.
Added Value
This study adds an empirical basis to the debate on whether or not to offer do-not-know options in web surveys. We show that explicitly offering a do-not-know option in a web survey is not advisable; allowing respondents to skip a question and programming friendly probes is a good alternative.
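The recommended strategy can be sketched in code. This is a minimal, hypothetical handler (the function and message text are illustrative assumptions, not taken from the study): a blank submission is not force-blocked, but triggers one friendly probe; a second blank submission is then accepted as item nonresponse.

```javascript
// Sketch of 'allow skipping, probe once' for a web survey question.
// state.probed records whether this respondent was already probed on this item.
function handleSubmit(answer, state) {
  const blank = answer === null || answer.trim() === "";
  if (!blank) {
    // Substantive answer: accept it as given.
    return { action: "accept", value: answer.trim(), probed: state.probed };
  }
  if (!state.probed) {
    // First blank submission: one gentle probe, no forced answer.
    return {
      action: "probe",
      message: "Your answer is important to us. Could you please try to answer this question?",
      probed: true,
    };
  }
  // Blank again after the probe: record missing data without irritating the respondent.
  return { action: "accept", value: null, probed: true };
}
```

The key design choice, in line with the findings above, is that the probe substitutes for a forced answer: it stimulates the question-answer process once, then lets the respondent move on.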