
Web Survey Bibliography

Title: Speeding and Non-Differentiation in Web Surveys: Evidence of Correlation and Strategies for Reduction
Author: Zhang, Che.
Year: 2013
Access date: 31.05.2013

The interactivity of the Web can be harnessed to improve online response quality. A small body of research has begun to explore interactive prompts that reduce respondent satisficing, i.e., providing adequate but not optimal answers. For example, in our earlier work, speeding (responding very quickly) is reduced with an interactive, textual prompt when responses are very fast (< 1/3 second per word). These and other studies have focused on a single satisficing behavior, although respondents who engage in one satisficing behavior are likely to engage in others while completing the questionnaire. In fact, emerging evidence suggests a strong correlation between two well-known satisficing behaviors in Web surveys: speeding and non-differentiation (giving very similar ratings in grid questions). Given that both speeding and non-differentiation are prominent satisficing behaviors, which one should be addressed through prompting, and does prompting one behavior rather than the other affect data quality differently? We tested this in an experiment using a probability-based online panel. We compared two types of prompts in a series of grid questions, one targeting only speeding and the other only non-differentiation (we also included a control condition with no prompt). We find that prompting either speeding or non-differentiation can curtail both behaviors on grid questions. This reflects the inherent correlation between the two satisficing behaviors and, more importantly, suggests that both prompts indeed lead to more thoughtful answers (in contrast to the two prompts having only parallel effects, where the speeding prompt reduced only speeding and vice versa). In addition, both prompts seem to enhance the quality of answers to questions other than grid questions, suggesting potentially broad effects on respondent performance.
We will also report evidence on the impact of the prompts on respondents' behavior in subsequent surveys of this panel, and on whether any carry-over effects differ between the two types of prompts.

Access/Direct link

Conference Homepage (abstract)

Year of publication: 2013
Bibliographic type: Conferences, workshops, tutorials, presentations
