Web Survey Bibliography
Relevance & Research Question
When designing questionnaires, an important decision is whether or not to include a ‘do-not-know’ option. In interviewer-administered surveys this dilemma is solved by not explicitly offering ‘do-not-know’ but accepting it when it occurs: interviewers are instructed to accept a non-substantive answer only after a gentle probe.
Online surveys, being self-administered, lack an interviewer. Web survey designers are therefore hesitant to offer an explicit ‘do-not-know’ option, and a required answer is often the default in standard survey software. However, survey methodologists strongly advise against this forced-answer strategy: requiring an answer does not guarantee that the right answer is given, and it may lead to irritation and more break-offs, or to guessing and less valid answers, thereby reducing data quality.
Methods and Data
The data were collected among members of the LISS panel, a probability-based panel of the Dutch population. The questionnaire contained questions that had shown a high percentage of item nonresponse in previous self-administered surveys. A three-by-two experimental design was used. Factor A compared not offering ‘do-not-know’ explicitly with offering it in two different ways: visually separated from the substantive response options, or as a special button. Factor B compared immediately accepting a ‘do-not-know’ with accepting it only after a friendly probe. Respondents were randomly assigned to the experimental conditions.
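The three-by-two design above can be sketched as a small randomization routine. This is a minimal illustration, not the authors' implementation; the condition labels are hypothetical names for the levels described in the abstract.

```python
import random

# Hypothetical labels for the factor levels described in the abstract.
FACTOR_A = ["no_explicit_dk", "dk_visually_separated", "dk_special_button"]
FACTOR_B = ["accept_dk_immediately", "accept_dk_after_probe"]

# The full 3 x 2 design: every combination of the two factors.
CONDITIONS = [(a, b) for a in FACTOR_A for b in FACTOR_B]

def assign_condition(rng=random):
    """Randomly assign a respondent to one of the six experimental cells."""
    return rng.choice(CONDITIONS)
```

Crossing the factors yields six cells, and simple random assignment gives each respondent an equal chance of landing in any of them.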
Results
We found clear effects of offering ‘do-not-know’ and of probing. Not explicitly offering ‘do-not-know’ (but allowing respondents to skip), followed by a friendly probe, resulted in the least missing information. Respondent evaluations showed that when ‘do-not-know’ was offered explicitly, the questions were experienced as less difficult. When a probe was used, respondents indicated that the questions made them think more about the topic. These results suggest that offering a ‘do-not-know’ option without probing gives respondents an easy escape, while probing stimulates the question-answer process. The scale reliabilities support this interpretation.
Added Value
This study adds an empirical basis to the debate on whether or not to offer ‘do-not-know’ options in web surveys. We show that explicitly offering a ‘do-not-know’ option in a web survey is not advisable. Allowing respondents to skip a question and programming friendly probes is a good alternative.
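The recommended skip-plus-probe strategy can be sketched as a small decision rule. This is an illustrative sketch only, assuming a single friendly probe before a skip is accepted; the function name and return values are hypothetical, not part of any survey package.

```python
def handle_answer(answer, probes_given, max_probes=1):
    """Decide how a web survey might handle a submitted (or missing) answer.

    Returns "accept" or "probe". A blank answer triggers a friendly
    probe; once max_probes probes have been shown, the skip is
    accepted rather than forcing a response.
    """
    if answer is not None and str(answer).strip():
        return "accept"   # substantive answer: accept as given
    if probes_given < max_probes:
        return "probe"    # show a friendly reminder, let the respondent retry
    return "accept"       # still blank after probing: accept the skip
```

The key design choice is that the final branch accepts the skip, so respondents are never forced to answer, which is the behavior the abstract argues against.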
Web survey bibliography - General Online Research Conference (GOR) 2013 (34)
- Respondent Rewards: Money for Nothing?; 2013; Martin, P.
- Pros and cons of virtual interviewers – vote in the discussion about surveytainment; 2013; Póltorak, M., Kowalski, J.
- The fish model: What factors affect participants while filling in an online questionnaire?; 2013; Mohamed, B., Lorenz, A., Pscheida, D.
- Interview Duration in Web Surveys: Integrating Different Levels of Explanation; 2013; Rossmann, J., Gummer, T.
- The monetary value of good questionnaire design; 2013; Tress, F.
- Technical and methodological meta-information on current practices in online research: A full population...; 2013; Burger, C., Stieger, S.
- Using interactive feedback to enhance response quality in Web surveys. The case of open-ended questions...; 2013; Emde, M., Fuchs, M.
- Reducing Response Order Effects in Check-All-That-Apply Questions by Use of Dynamic Tooltip Instructions...; 2013; Kunz, T., Fuchs, M.
- Measuring wages via a volunteer web survey – a cross-national analysis of item nonresponse; 2013; Steinmetz, S., Annmaria, B.
- Does one really know?: Avoiding noninformative answers in a reliable way.; 2013; de Leeuw, E. D., Boevee, A., Hox, J.
- Sensitive Topics in PC and Mobile Web Surveys; 2013; Mavletova, A. M., Couper, M. P.
- Mobile Research Performance: How Mobile Respondents Differ from PC Users Concerning Interview Quality...; 2013; Schmidt, S., Wenzel, O.
- Who responds to website visitor satisfaction surveys?; 2013; Andreadis, I.
- Measuring working conditions in a volunteer web survey; 2013; de Pedraza, P., Villacampa, A.
- Sampling online communities: using triplets as basis for a (semi-) automated hyperlink web crawler.; 2013; Veny, Y.
- Why are you leaving me?? - Personality predictors of answering drop out in an online-study; 2013; Thielsch, M., Nestler, S., Back, M.
- Propensity Score Weighting – Can Personality Adjust for Selectivity?; 2013; Glantz, A., Greszki, R.
- Research Design as an Influencing Factor for Reliability in Online Market Research; 2013; Wengrzik, J., Theuner, G.
- Ethics, privacy and data security in web-based course evaluation; 2013; Salaschek, M., Meese, C., Thielsch, M.
- Seducing the respondent – how to optimise invitations in on-site online research?; 2013; Póltorak, M., Kowalski, J.
- Influence of mobile devices in online surveys; 2013; Maxl, E., Baumgartner, T.
- E-questionnaire in cross-sectional household surveys; 2013; Karaganis, M.
- GESIS Online Panel Pilot: Results from a Probability-Based Online Access Panel; 2013; Kaczmirek, L., Bandilla, W., Schaurer, I., Struminskaya, B., Weyandt, K.
- Online Survey – Research with children on advertising impact; 2013; Funkenweh, V., Busch, J., Amthor, A. L., Boeer, A., Gaedke, J.
- HTML5 and mobile Web surveys: A Web experiment on new input types; 2013; Funke, F.
- Metadata on the demographics of online research: Results from a full-range study of available online...; 2013; Burger, C., Stieger, S.
- How the screen-out influence the dropout of a commercial panel; 2013; Bartoli, B.
- Beyond methodology - some ethical implications of "doing research online"; 2013; Heise, N.
- Innovation in Data Collection: the Responsive Design Approach; 2013; Bianchi, A., Biffignandi, S.
- Break-off and attrition in the GIP amongst technologically experienced and inexperienced participants...; 2013; Blom, A. G., Bossert, D., Clark, V., Funke, F., Gebhard, F., Holthausen, A., Krieger, U., Wachenfeld...
- Nonresponse and Nonresponse Bias in a Probability-Based Internet Panel; 2013; Blom, A. G., Bossert, D., Funke, F., Gebhard, F., Holthausen, A., Krieger, U.
- Rewards - Money for Nothing?; 2013; Cape, P. J., Martin, P.
- Effects of incentive reduction after a series of higher incentive waves in a probability-based online...; 2013; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- Timing of Nonparticipation in an Online Panel: The effect of incentive strategies; 2013; Douhou, S., Scherpenzeel, A.