Web Survey Bibliography
Relevance & Research Question: When respondents are asked questions with long lists of response options that allow multiple answers, response order effects may occur. In Web surveys, primacy effects are to be expected due to their self-administered nature, since respondents’ attention declines when reading long lists of response options. Instructions asking respondents to process all response options to the same extent have shown only limited effect. Web surveys, however, offer the opportunity to use dynamic tools that can increase respondents’ attention to such instructions. Tooltips, among other techniques, are an effective means of directing respondents’ attention to specific elements of a question, in particular to instructions.
Methods & Data: The effect of tooltip instructions was tested in a randomized 2x4 between-subjects experiment embedded in a survey among university applicants (n=6,000). Factor 1 varied the order in which a long list of 13 response options was presented (original vs. reversed order). Factor 2 varied the way the instruction was presented: a control condition without instruction, a static instruction displayed underneath the question, a dynamic tooltip instruction appearing each time a respondent hovered the mouse pointer over a response option, and a combined condition integrating both the static and the dynamic tooltip instruction. For each condition we computed response order effects and compared their magnitude in order to assess the effectiveness of each instruction format relative to the control condition.
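The 2x4 assignment described above can be sketched as follows. This is a minimal illustration only, assuming a simple per-respondent randomization; the condition labels and option names are placeholders, not taken from the study:

```python
import random

# Factor 1: response option order (original vs. reversed).
ORDER_LEVELS = ["original", "reversed"]
# Factor 2: instruction format (control, static, dynamic tooltip, combined).
INSTRUCTION_LEVELS = ["control", "static", "tooltip", "static+tooltip"]

# 13 response options, as in the study; names here are hypothetical.
RESPONSE_OPTIONS = [f"option_{i}" for i in range(1, 14)]

def assign_condition(rng: random.Random) -> dict:
    """Randomly assign one respondent to one of the 2 x 4 = 8 cells."""
    order = rng.choice(ORDER_LEVELS)
    instruction = rng.choice(INSTRUCTION_LEVELS)
    # Reverse the option list for respondents in the "reversed" condition.
    options = (RESPONSE_OPTIONS if order == "original"
               else list(reversed(RESPONSE_OPTIONS)))
    return {"order": order, "instruction": instruction, "options": options}

rng = random.Random(42)
condition = assign_condition(rng)
```

Comparing the share of respondents endorsing each option between the original-order and reversed-order cells then yields the response order effects analyzed in the study.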
Results: Findings indicate that both the average response order effect and the number of significant effects can be reduced by dynamic tooltip instructions. The reduction is comparable in size to that achieved by a static instruction. When the two modes of presenting the instruction are combined (static and dynamic), the reduction in response order effects is even more pronounced.
Added Value: The study demonstrates the benefits of providing dynamic feedback to respondents in Web surveys. Tooltip technology works as a continuous stimulus throughout the question-answer process. However, dynamic instructions are not a substitute for static instructions; rather, the findings suggest that dynamic instructions are particularly effective when used jointly with conventional static instructions.