
Web Survey Bibliography

Title Investigating Cognitive Effort of Response Formats in Web Surveys using Paradata
Year 2016
Access date 06.06.2016
Abstract Since Likert’s (1932) seminal article “A Technique for the Measurement of Attitudes”, the use of agree/disagree questions has become increasingly popular in empirical social research, because they appear to make it possible to measure several constructs with a single response format. Fowler (1995), by contrast, argues that construct-specific questions are a simpler, more direct, and more informative method than agree/disagree questions. Until now, this assumption had never been tested empirically. In this study, we investigate the cognitive effort of agree/disagree and construct-specific questions using paradata. By measuring response times, we can examine respondents’ cognitive information processing during web surveys and thus determine the burden imposed by these response formats. We applied an innovative two-stage outlier correction based on respondent activity during web survey completion, combined with an outlier definition based on the response time distributions. Moreover, computer mouse clicks were captured to evaluate the response times. We conducted an experimental study with four groups (n1 = 255; n2 = 237; n3 = 278; n4 = 235) based on an onomastic sampling approach. We tested agree/disagree and construct-specific questions, presented both individually and in grids. The results indicate that single construct-specific questions produce significantly higher response times than their agree/disagree counterparts, suggesting that they demand greater cognitive effort. However, there are no differences between the grids. In addition, the mouse clicks show no significant differences between the groups; thus, they appear not to affect response times. Altogether, construct-specific questions seem to require deeper cognitive information processing than agree/disagree questions when presented individually.
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations

Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 71st Annual Conference, 2016