
Web Survey Bibliography

Title: Use of Drag-and-Drop Rating Scales in Web Surveys and Its Effect on Survey Reports and Data Quality
Author: Kunz, T.
Year: 2013
Access date: 31.05.2013
Abstract

In Web surveys, rating scales measuring respondents' attitudes and behaviors by means of a series of related statements are commonly presented in grid formats. Grid questions have clear benefits: multiple items are neatly arranged and easy to complete on a single screen. However, grid formats often evoke satisficing behavior, as respondents rush through a list of serial items quickly. This, in turn, may come at the expense of processing each item carefully, resulting, among other things, in less differentiated answers compared to grids with fewer items or single-item-per-screen formats. The present experiment is designed to gain a better understanding of how respondents answer rating scale questions and how the quality of rating scale answers is influenced by different kinds of grid formats. For that purpose, two types of drag-and-drop rating scales are developed with the aim of retaining the benefits of a grid format while preventing satisficing behavior: respondents either (1) drag answer options arranged horizontally in the top row onto the question items in the first column, or (2) drag question items stacked in the top row onto the answer options in the first column. A 3 x 5 factorial design is implemented in a randomized field-experimental Web survey conducted among university applicants (n = 6,000), with varying numbers of items (6, 10, and 16) presented in drag-and-drop formats or standard grids. Rating scale formats are examined in terms of response distribution and indicators of data quality (item nonresponse, nondifferentiation, acquiescence, and extremity bias). Results indicate that while all rating scale formats yield comparable substantive responses, drag-and-drop rating scales encourage higher item differentiation. However, results concerning other indicators of data quality are mixed and are discussed within the scope of the cognitive question-answer process.
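The data-quality indicators named in the abstract can be illustrated with a short sketch. This is a hypothetical example (not the authors' code): one common operationalization, assuming a 1-5 agreement scale where `None` marks a skipped item, nondifferentiation is measured as the lack of distinct answers, acquiescence as the share of agree-side answers, and extremity as the share of endpoint answers.

```python
# Hypothetical sketch of the per-respondent data-quality indicators
# mentioned in the abstract; operationalizations are assumptions,
# not taken from the paper.

def quality_indicators(ratings, scale_min=1, scale_max=5):
    """Compute indicators for one respondent's list of ratings.

    ratings: list of ints on [scale_min, scale_max], with None for
    skipped items.
    """
    answered = [r for r in ratings if r is not None]
    n_items = len(ratings)
    # Item nonresponse: share of items left unanswered.
    nonresponse = (n_items - len(answered)) / n_items
    if not answered:
        return {"nonresponse": nonresponse, "nondifferentiation": None,
                "acquiescence": None, "extremity": None}
    midpoint = (scale_min + scale_max) / 2
    return {
        "nonresponse": nonresponse,
        # 0 = every answered item distinct; values near 1 = straight-lining.
        "nondifferentiation": 1 - len(set(answered)) / len(answered),
        # Share of answers on the agree side of the scale midpoint.
        "acquiescence": sum(r > midpoint for r in answered) / len(answered),
        # Share of answers at the scale endpoints.
        "extremity": sum(r in (scale_min, scale_max) for r in answered)
                     / len(answered),
    }

print(quality_indicators([4, 4, 4, None, 4, 5]))
```

For the example respondent above (five answers, one skip, mostly 4s), nonresponse is 1/6 and nondifferentiation is high (0.6), the straight-lining pattern the drag-and-drop formats are meant to discourage.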

Access/Direct link: Conference Homepage (abstract)

Year of publication: 2013
Bibliographic type: Conferences, workshops, tutorials, presentations

Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 68th Annual Conference, 2013 (88)
