
Web Survey Bibliography

Title Instructions in Self-administered Survey Questions: Do They Improve Data Quality or Just Make the Questionnaire Longer?
Year 2016
Access date 09.06.2016
Abstract
Pre-testing techniques utilized in the development of production self-administered questionnaires, such as cognitive interviewing, often identify items where respondents misinterpret or are unclear about the meaning of terms in a question. Typically, this finding results in a recommendation to add instructions to an item, which has the detrimental effect of lengthening the questionnaire. Previous experimental research has shown that instructions affect estimates when they counter the way many people naturally tend to think about a concept. For example, an instruction to exclude sneakers from a count of shoes will reduce the estimate of shoes because many respondents tend to think of sneakers as shoes. In addition, previous research has shown that instructions placed before questions are more effective than those placed after. However, few studies have examined empirically whether instructions that are the product of actual production pre-testing techniques are similarly effective or useful, and worth the extra length they create. Nor have many other factors that might influence the effectiveness of instructions been examined. To examine these issues further, we report on an experiment administered by web to a nationally representative sample. Production questions and instructions were selected from a national teacher survey. In addition, questions and instructions were intentionally created to counter teachers’ natural conceptions of terms. These items were compared to a control group with no instructions. Utilizing a factorial experimental design, we also varied three factors that were predicted to alter the effectiveness of instructions: their location, format, and wording. Although the findings of this experiment are most clearly generalizable to web surveys, they arguably extend to mail surveys as well.
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations

Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 71st Annual Conference, 2016
